2025-01-19 00:19:00 +00:00
trekhleb said:

@retoor I've tried to train a GPT with ~80M parameters on a single GPU in the browser so far. Pretty heavy. It will be interesting to see how a 1.5B-parameter model behaves...

@retoor I'm not sure; it probably depends on the model configuration/implementation and the hardware. But in the browser, for that "homemade GPT", I see that training on WebGPU is around 100x-1000x faster than on the CPU.

Yeah, the jobs list definitely needs some automated cleanup. Locations might be incorrect, and some jobs are not actually technical. As I mentioned, this is the first (draft) iteration of the project, so there is still work to do.

@retoor Yes, all Google Domains registrations are on Squarespace now.