# Kronos BTC 1hr Tokenizer
This is a fine-tuned Kronos tokenizer trained on Bitcoin 1-hour candlestick data.
## Model Details
- Model Type: Kronos Tokenizer
- Training Data: Bitcoin 1-hour candlestick data
- Architecture: Transformer-based encoder-decoder
- Model Size: ~15 MB
## Configuration
```json
{
  "attn_dropout_p": 0.0,
  "beta": 0.05,
  "d_in": 6,
  "d_model": 256,
  "ff_dim": 512,
  "ffn_dropout_p": 0.0,
  "gamma": 1.1,
  "gamma0": 1.0,
  "group_size": 4,
  "n_dec_layers": 4,
  "n_enc_layers": 4,
  "n_heads": 4,
  "resid_dropout_p": 0.0,
  "s1_bits": 10,
  "s2_bits": 10,
  "zeta": 0.05
}
```
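As a rough guide (these interpretations are assumptions, not documented here): `d_in: 6` likely corresponds to the six per-candle input features, and `s1_bits`/`s2_bits` appear to set the bit widths of the two hierarchical sub-token codebooks, i.e. 2^10 = 1024 codes each. A minimal sketch that derives these quantities from `config.json`:

```python
# Minimal sketch (assumptions): derive codebook sizes from the tokenizer config.
# s1_bits / s2_bits are assumed to be the bit widths of the two hierarchical
# sub-token codebooks; d_in is assumed to be the number of input features per candle.
import json

with open("config.json") as f:
    cfg = json.load(f)

s1_vocab = 2 ** cfg["s1_bits"]  # assumed coarse sub-token codebook size (1024)
s2_vocab = 2 ** cfg["s2_bits"]  # assumed fine sub-token codebook size (1024)

print(f"input features per candle: {cfg['d_in']}")
print(f"sub-token vocabulary sizes: {s1_vocab} x {s2_vocab}")
```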
## Usage
This tokenizer is part of the Kronos financial prediction system. It should be used in conjunction with the corresponding base model.
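End-to-end forecasting is handled by the upstream Kronos codebase, which pairs this tokenizer with the base model. The sketch below only shows how to fetch and inspect this checkpoint with standard Hugging Face tooling; the repository id is a placeholder.

```python
# Minimal sketch (placeholder repo id): download and inspect this tokenizer
# checkpoint using huggingface_hub and safetensors. Pairing it with the Kronos
# base model for prediction is done through the upstream Kronos codebase.
import json
from huggingface_hub import hf_hub_download
from safetensors.torch import load_file

REPO_ID = "<your-namespace>/kronos-btc-1h-tokenizer"  # placeholder

config_path = hf_hub_download(REPO_ID, "config.json")
weights_path = hf_hub_download(REPO_ID, "model.safetensors")

with open(config_path) as f:
    config = json.load(f)

state_dict = load_file(weights_path)
n_params = sum(t.numel() for t in state_dict.values())

print(f"d_model={config['d_model']}, enc/dec layers={config['n_enc_layers']}/{config['n_dec_layers']}")
print(f"{len(state_dict)} tensors, {n_params:,} parameters (~{n_params * 4 / 1e6:.0f} MB in fp32)")
```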
## Training Details
- Training Time: ~78 minutes
- Framework: PyTorch
## Files
- `config.json`: Tokenizer configuration
- `model.safetensors`: Tokenizer weights
- `README.md`: This file
## License
MIT License