pytorch-fsdp

Expert guidance for Fully Sharded Data Parallel training with PyTorch FSDP: parameter sharding, mixed precision, CPU offloading, and FSDP2
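A minimal sketch of what this skill covers: wrapping a model with PyTorch's `FullyShardedDataParallel`, with mixed precision and CPU offloading configured via the public `MixedPrecision` and `CPUOffload` APIs. The model, sizes, and hyperparameters below are illustrative; a real run would be launched with `torchrun` across multiple GPU ranks.

```python
import torch
import torch.nn as nn
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from torch.distributed.fsdp import MixedPrecision, CPUOffload

def main():
    # Assumes launch via torchrun, which sets the env vars
    # init_process_group needs (RANK, WORLD_SIZE, MASTER_ADDR, ...).
    dist.init_process_group("nccl")
    torch.cuda.set_device(dist.get_rank() % torch.cuda.device_count())

    # Toy model; in practice this is your transformer / large module.
    model = nn.Sequential(
        nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 1024)
    ).cuda()

    fsdp_model = FSDP(
        model,
        # Compute and gradient reduction in bf16; master params stay fp32.
        mixed_precision=MixedPrecision(
            param_dtype=torch.bfloat16,
            reduce_dtype=torch.bfloat16,
        ),
        # Flip to offload_params=True to keep sharded params on CPU
        # between forward/backward, trading speed for GPU memory.
        cpu_offload=CPUOffload(offload_params=False),
    )

    optim = torch.optim.AdamW(fsdp_model.parameters(), lr=1e-4)
    x = torch.randn(8, 1024, device="cuda")
    loss = fsdp_model(x).sum()
    loss.backward()   # gradients are reduce-scattered across ranks
    optim.step()      # each rank updates only its parameter shard

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

FSDP2 replaces this wrapper-class style with the `fully_shard` module-level API; the skill covers both.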

$ Install

git clone https://github.com/zechenzhangAGI/AI-research-SKILLs /tmp/AI-research-SKILLs && cp -r /tmp/AI-research-SKILLs/08-distributed-training/pytorch-fsdp ~/.claude/skills/pytorch-fsdp

// tip: Run this command in your terminal to install the skill