I'm suddenly curious whether this is a fairly static limit: 8GB for LLMs. Is there a paper or post somewhere describing how it will move up or down with technology changes? Does it depend entirely on the "thickness" or density of the data analyzed to produce the model? I'm imagining a spreadsheet or database table with 100 rows versus one with 1,000. I suppose it also comes down to how thoroughly the data was collected, since not all attributes are dependent on one another, and so on.
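
For my own back-of-the-envelope math, here's a rough sketch. It assumes the 8GB figure refers to the RAM/VRAM needed to hold a model's weights for inference, and that the footprint is roughly parameter count × bytes per parameter plus some overhead; the model sizes and the overhead factor are just illustrative numbers I picked, not from any source.

```python
# Back-of-envelope: memory needed to hold an LLM's weights for inference.
# Assumption: footprint ~= parameter_count * bytes_per_parameter, times a
# crude fudge factor for runtime buffers / KV cache. Model sizes below are
# illustrative round numbers, not any specific released model.

BYTES_PER_PARAM = {
    "fp16": 2.0,   # 16-bit floats
    "int8": 1.0,   # 8-bit quantization
    "q4":   0.5,   # ~4-bit quantization
}

OVERHEAD = 1.2  # rough allowance for activations and caches (my guess)

def approx_gb(params_billions: float, precision: str) -> float:
    """Approximate gigabytes of memory to run a model of the given size."""
    total_bytes = params_billions * 1e9 * BYTES_PER_PARAM[precision] * OVERHEAD
    return total_bytes / 2**30

for size in (7, 13, 70):
    for prec in ("fp16", "int8", "q4"):
        print(f"{size}B params @ {prec}: ~{approx_gb(size, prec):.1f} GB")
```

If that framing is right, the sketch suggests the limit tracks the finished model's parameter count and quantization rather than the raw row count of whatever data was used to train it, though I'd love a pointer to something more rigorous.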