Projects - Daize Dong
You can find more of my projects on my GitHub Page.
Deep Learning
LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training [GitHub] | [Paper]
A series of open-source Mixture-of-Experts (MoE) models based on LLaMA 2 and SlimPajama.
LLaMA-MoE v2: Exploring Sparsity of LLaMA from Perspective of Mixture-of-Experts with Post-Training [GitHub] | [Paper]
A series of open-source Mixture-of-Experts (MoE) models based on LLaMA 3.
Other Stuff
ChatGPT ArXiv Paper Assistant [GitHub]
A ChatGPT/Gemini/DeepSeek-based personalized arXiv paper assistant bot for automatic paper filtering.
Easier PS and SoP [GitHub]
A LaTeX template for writing Personal Statements (PS) and Statements of Purpose (SoP) for graduate school applications.