
How to size compute/GPU/storage/network for generative AI or an LLM?


I would like to provision compute (servers), GPUs (say 2x A100 80GB or H100), storage, and network (maybe 100GbE) to run the OpenLLaMA 7B model (https://huggingface.co/openlm-research/open_llama_7b).

How do I go about sizing this? AWS/GCP cluster sizing is okay too.
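Before choosing between 2 GPUs and a larger cluster, a back-of-envelope memory estimate helps. The sketch below is a rough calculation, not a measurement: it assumes fp16 weights (2 bytes per parameter) and LLaMA-7B-like shapes (32 layers, hidden size 4096); the batch size and context length are arbitrary placeholders to replace with your own workload.

```python
# Back-of-envelope GPU memory sizing for a 7B-parameter model in fp16.
# All shapes below are assumptions to check against the actual model config.

PARAMS = 7e9
BYTES_FP16 = 2

weights_gb = PARAMS * BYTES_FP16 / 1e9             # ~14 GB of weights

# KV cache per token: 2 (K and V) * layers * hidden_size * bytes_per_value
layers, hidden = 32, 4096                          # LLaMA-7B-like shapes
kv_per_token = 2 * layers * hidden * BYTES_FP16    # ~0.5 MB per token

batch, context = 8, 2048                           # placeholder workload
kv_cache_gb = batch * context * kv_per_token / 1e9 # ~8.6 GB

print(f"weights ~{weights_gb:.0f} GB, KV cache ~{kv_cache_gb:.1f} GB")
# ~14 GB weights + ~9 GB KV cache + activation/framework overhead fits
# comfortably on a single A100 80GB for inference. Training is another
# story: Adam in mixed precision needs ~16 bytes/parameter (fp32 master
# weights, two optimizer moments, fp16 weights and grads), i.e. ~112 GB
# of optimizer state alone, before activations.
```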

techele: Zac, it does not.

techele: So this model was trained on ~1 trillion tokens and takes, say, ~100GB of disk space (it is actually less), but for the sake of argument allow ~1TB with redundancy etc. Now I need to run some containerized neural-net code (or is bare metal preferred over containers?).

1. What should the CPU, RAM, and GPU spec be for each node?
2. How many containers per node?
3. How many such nodes?
4. What about HA, DP, and DR in this case?
5. How much time (expected) for the model to train?
6. What could be a typical response time for inferencing?
7. Network considerations?
8. How will this model scale?
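Question 5 has a useful first-order answer: training compute is roughly 6 × parameters × training tokens in FLOPs, divided by sustained cluster throughput. Here is a minimal sketch under stated assumptions (2x A100, ~312 TFLOPS bf16 peak each, 40% sustained utilization — all assumptions to adjust, not measured figures):

```python
# First-order training-time estimate: FLOPs ≈ 6 * N * D (forward + backward).
# Assumptions: 2x A100 80GB, 312 TFLOPS bf16 peak each, 40% utilization.

N = 7e9              # model parameters
D = 1e12             # training tokens (the ~1T figure mentioned above)
total_flops = 6 * N * D              # ≈ 4.2e22 FLOPs

gpus = 2
peak_flops = 312e12                  # per-GPU bf16 peak, FLOPs/s
utilization = 0.40                   # optimistic sustained fraction of peak
sustained = gpus * peak_flops * utilization

seconds = total_flops / sustained
print(f"~{seconds / 86400:.0f} days (~{seconds / 86400 / 365:.1f} years)")
# Roughly 5+ years: full pretraining at this scale is impractical on a
# 2-GPU cluster. Fine-tuning, where D is smaller by orders of magnitude,
# is the realistic workload for this hardware.
```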
mfinni: The linked canonical answer is that YOU have to benchmark YOUR code (even if it's not code you wrote, it's the code you chose). Hopefully the creator of this package has published sizing guidelines, or will be responsive if you reach out with these questions. No one here is likely to have a ready answer for you.
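To make "benchmark your code" concrete: a minimal latency/throughput probe with the Hugging Face transformers API could look like the sketch below. The model ID comes from the question; the dtype, prompt, and token counts are assumptions, and use_fast=False reflects the OpenLLaMA model card's advice to avoid the auto-converted fast tokenizer.

```python
# Minimal inference benchmark sketch. Assumes torch + transformers are
# installed and a CUDA GPU with room for the fp16 weights (~14 GB).
import time
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openlm-research/open_llama_7b"  # from the question
tok = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

inputs = tok("A short sizing-test prompt", return_tensors="pt").to(model.device)

# Warm-up run so CUDA kernel compilation/caching doesn't pollute the timing.
model.generate(**inputs, max_new_tokens=8)

new_tokens = 128
start = time.perf_counter()
model.generate(**inputs, max_new_tokens=new_tokens)
torch.cuda.synchronize()
elapsed = time.perf_counter() - start

print(f"{new_tokens / elapsed:.1f} tokens/s, {elapsed:.2f} s end-to-end")
# Repeat across the batch sizes, prompt lengths, and concurrency levels you
# actually expect; published numbers rarely transfer across workloads.
```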
mfinni: There's SOME anecdata in the comments on this Hacker News thread: https://news.ycombinator.com/item?id=35798888. You're going to have to do your own research on this one.
