University of Queensland
Brisbane, Australia
https://jameszhou-gl.github.io
Stars
SmolLM2: A tiny yet mighty 147M LLaMA-style model with RoPE, GQA, and SwiGLU — built for speed and efficiency.
LLaMA 2 implemented from scratch in PyTorch
gpt-oss-120b and gpt-oss-20b are two open-weight language models by OpenAI
Code for the paper "ClinicalBench: Can LLMs Beat Traditional ML Models in Clinical Prediction?"
[Nature Communications] The official code for "Quantifying the Reasoning Abilities of LLMs on Real-world Clinical Cases".
Code repository for the framework to engage in clinical decision making task using the MIMIC-CDM dataset.
Code repository to create the MIMIC-CDM Dataset.
[EMNLP'24] EHRAgent: Code Empowers Large Language Models for Complex Tabular Reasoning on Electronic Health Records
A Paper collection for LLM based Patient Simulators
[ECAI'25] Generating Clinically Realistic EHR Data via a Hierarchy- and Semantics-Guided Transformer
Repository for Publicly Available Clinical BERT Embeddings
Simple, minimal implementation of the Mamba SSM in one file of PyTorch.
Code for TMM-HCVP: Leveraging Hierarchical Contrastive Visual Prompt for Domain Generalization
Open Synthea patient data for machine learning and team training.
Generating synthetic Electronic Health Records using continuous-time diffusion models.
A benchmark for few-shot evaluation of foundation models for electronic health records (EHRs)
Code repository for the paper: Causal thinking for decision making on Electronic Health Records: why and how
EMNLP'22 | PromptEHR: Conditional Electronic Healthcare Records Generation with Prompt Learning
PyTrial: A Comprehensive Platform for Artificial Intelligence for Drug Development
Med-BERT, contextualized embedding model for structured EHR data
Deep Generative Modelling of Patient Timelines using Electronic Health Records
Code for BEHRT: Transformer for Electronic Health Records
A simple and efficient Mamba implementation in pure PyTorch and MLX.