Wrong shape of doc_topics in LDA example?

Hi, I had a question about the LDA example. In the “documents” plate of the model, doc_topics has shape (num_docs, num_topics), i.e. (1000, 8). How is this possible? Since it is inside that plate, shouldn't the documents be on axis -1? The example runs as-is, but any modification I make based on that assumption breaks with a shape mismatch complaining that 1000 must be the last axis. What gives?

Hi @pyro.technician, in this example num_docs is the size of a batch dimension (at batch position -1) and num_topics is the size of an event dimension (the event dimension of a Dirichlet random variable). You can read more about the difference between batch dimensions and event dimensions in the tensor shapes tutorial. The key fact is that batch dimensions are always to the left of event dimensions, so doc_topics has shape batch_shape + event_shape = (1000,) + (8,) = (1000, 8): num_docs is still the rightmost batch dimension even though it is not the last axis of the tensor.
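You can see this directly with torch.distributions (which Pyro's distributions wrap); this is just a minimal sketch, not the example model itself. The numbers 1000 and 8 are taken from your post:

```python
import torch
from torch.distributions import Dirichlet

# One topic-probability vector per document: a (1000, 8) concentration
# parameter gives 1000 independent Dirichlets over 8 topics.
d = Dirichlet(torch.ones(1000, 8))

print(d.batch_shape)  # torch.Size([1000]) -- num_docs is a batch dimension
print(d.event_shape)  # torch.Size([8])    -- num_topics is the event dimension

# Samples have shape batch_shape + event_shape, so the batch axis for
# documents sits to the LEFT of the event axis for topics.
doc_topics = d.sample()
print(doc_topics.shape)  # torch.Size([1000, 8])
```

So when indexing by document you should count batch dimensions from the right of batch_shape (here, batch axis -1), not from the right of the full tensor.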
