🎯 Focusing


# Hongzhi Gao

Data Infrastructure / Data Engineering Engineer
M.S. in Software Engineering, Zhejiang University

**Tech:** C++ / Java / Go / Python · Distributed Systems · Storage Engines

📫 hongzhigao@apache.org


## Recent Activity

Friday, April 24, 2026, 02:53:22 Beijing (UTC+8)

  1. ⬆️ Pushed to hongzhi-gao/vllm-omni on feature/sd3-fp8: feature sd3.0 fp8 quantization
  2. ⬆️ Pushed to hongzhi-gao/vllm-omni on feature/sd3-fp8: Merge branch 'main' into feature/sd3-fp8
  3. ⬆️ Pushed to hongzhi-gao/vllm-omni on feature/sd3-fp8: Merge branch 'main' into feature/sd3-fp8
  4. ⬆️ Pushed to hongzhi-gao/vllm-omni on feature/sd3-fp8: feature sd3.0 fp8 quantization
  5. ⬆️ Pushed to hongzhi-gao/vllm-omni on feature/sd3-fp8: Merge branch 'main' into feature/sd3-fp8
  6. 💪 Opened PR #3061 in vllm-project/vllm-omni
  7. ⬆️ Pushed to hongzhi-gao/vllm-omni on feature/sd3-fp8: feature sd3.0 fp8 quantization
  8. 🔁 Merged PR #17347 in apache/iotdb
  9. ❌ Closed PR #17544 in apache/iotdb
  10. 🔁 Reopened PR #17347 in apache/iotdb

## Popular repositories

  1. **tsfile** (forked from apache/tsfile) · Apache TsFile · Java
  2. **iotdb** (forked from apache/iotdb) · Apache IoTDB · Java
  3. **hongzhi-gao** · Python
  4. **vllm** (forked from vllm-project/vllm) · A high-throughput and memory-efficient inference and serving engine for LLMs · Python
  5. **vllm-omni** (forked from vllm-project/vllm-omni) · A framework for efficient model inference with omni-modality models · Python