chronos embeddings shape #267
Replies: 3 comments
-
So Chronos gives a representation for each time stamp and series you provide; see here for details. Do note that
-
Thank you @StatMixedML for the reply. @xabiersuarez, if you would like a single embedding, you can average over the token dimension as suggested by @StatMixedML, or you can take the embedding of the last token, which should serve as a sort of global representation of the series. Either option may work better depending on the downstream use case.
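For concreteness, here is a minimal sketch of both options. The random tensor stands in for the real output of `model.embed(context)`, which (as in the question below) has shape `[batch, num_tokens, d_model]`:

```python
import torch

# Stand-in for the output of model.embed(context):
# shape [batch, num_tokens, d_model], e.g. [1, 1942, 256].
embeddings = torch.randn(1, 1942, 256)

# Option 1: mean-pool across the token dimension.
mean_embedding = embeddings.mean(dim=1)   # shape [1, 256]

# Option 2: take the last token's embedding as a global summary.
last_embedding = embeddings[:, -1, :]     # shape [1, 256]
```

Both reductions give one 256-dimensional vector per series; which works better is an empirical question for your downstream task.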
-
Thanks!! 👌🏼
-
Hi, I have a question about the use/output of embeddings with Amazon's pre-trained Chronos models. I have a dataset of daily product sales (1941 days) containing 84 different series (identified by an 'ID' column).
My idea was to get embeddings for these series. To get the embedding of one series ('serie1'), I did the following:
```python
import pandas as pd
import torch
from chronos import ChronosPipeline

df = pd.read_csv(...)
model = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-tiny",
    device_map="cpu",
    torch_dtype=torch.bfloat16,
)
context = torch.tensor(df[df['ID'] == 'serie1']['y'].tolist())
model.tokenizer.config.context_length = len(context)
embeddings, tokenizer_state = model.embed(context)
```
I find that `embeddings.shape` is `torch.Size([1, 1942, 256])`. That is, for 'serie1' I get a 256-dimensional vector for each point of the series (1941 points plus one extra token). However, my goal is a single representation of the entire series, not one per point.
Could someone help me? Thanks!