B References
[1] P. Rajpurkar, J. Irvin, K. Zhu, B. Yang, H. Mehta, T. Duan, D. Ding, A. Bagul, C. Langlotz, K. Shpanskaya et al., “CheXNet: Radiologist-Level Pneumonia Detection on Chest X-Rays with Deep Learning,” arXiv preprint arXiv:1711.05225, 2017. [Online]. Available: https://arxiv.org/abs/1711.05225
[2] Nvidia News Center, “NVIDIA AI Inference Performance Milestones: Delivering Leading Throughput, Latency and Efficiency.” [Online]. Available: https://news.developer.nvidia.com/nvidia-ai-inference-performance-milestones-delivering-leading-throughput-latency-and-efficiency/
[3] TensorFlow Guide, “Estimators (tf.estimator.Estimator).” [Online]. Available: https://www.tensorflow.org/guide/estimators
[4] TensorFlow Guide, “Creating Custom Estimators.” [Online]. Available: https://www.tensorflow.org/guide/custom_estimators
[5] Medium, “Multithreaded predictions with TensorFlow Estimators.” [Online]. Available: https://medium.com/element-ai-research-lab/multithreaded-predictions-with-tensorflow-estimators-eb041861da07
[6] TensorFlow Official Models, “ResNet in TensorFlow.” [Online]. Available: https://github.com/tensorflow/models/tree/master/official/resnet
[7] TensorFlow Guide, “Checkpoints: Restoring your model.” [Online]. Available: https://www.tensorflow.org/guide/checkpoints#checkpointing_frequency
[8] Stack Overflow, “What's the difference of name scope and a variable scope in tensorflow?” [Online]. Available: https://stackoverflow.com/questions/35919020/whats-the-difference-of-name-scope-and-a-variable-scope-in-tensorflow; TensorFlow Guide, “Sharing variables.” [Online]. Available: https://www.tensorflow.org/guide/variables#sharing_variables
[9] TensorFlow API, “tf.estimator.Estimator.export_savedmodel (Exports inference graph as a SavedModel).” [Online]. Available: https://www.tensorflow.org/api_docs/python/tf/estimator/Estimator#export_savedmodel
[10] Medium, “Serving Pre-Modeled and Custom Tensorflow Estimator with Tensorflow Serving.” [Online]. Available: https://medium.com/@yuu.ishikawa/serving-pre-modeled-and-custom-tensorflow-estimator-with-tensorflow-serving-12833b4be421
[11] Nvidia, “Accelerating Inference In TensorFlow With TensorRT™ User Guide.” [Online]. Available: https://docs.nvidia.com/deeplearning/dgx/integrate-tf-trt/index.html
[12] Nvidia, “Accelerating Inference In TensorFlow With TensorRT™ User Guide: Using TF-TRT.” [Online]. Available: https://docs.nvidia.com/deeplearning/dgx/integrate-tf-trt/index.html#usingtftrt
[13] Nvidia, “Accelerating Inference In TensorFlow With TensorRT™ User Guide: Supported Ops.” [Online]. Available: https://docs.nvidia.com/deeplearning/dgx/integrate-tf-trt/index.html#support-ops; Nvidia, “TensorRT Developer Guide: Working With Deep Learning Frameworks.” [Online]. Available: https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#build_model