A PHP implementation of OpenAI's BPE tokenizer tiktoken.
Updated Jan 25, 2025 - PHP
Byte-Pair Encoding tokenizer for training large language models on huge datasets
(1) Train large language models to help people with automatic essay scoring. (2) Extract essay features and train a new tokenizer to build tree models for score prediction.
NATural LANguage Processing
A basic BPE tokenizer for NLP
Transformer models for humorous text generation, fine-tuned on a Russian jokes dataset with ALiBi, RoPE, GQA, and SwiGLU, plus a custom byte-level BPE tokenizer.
A modular Byte Pair Encoding (BPE) tokenizer implemented in Python
A naive implementation of the byte pair encoding (BPE) tokenization algorithm in Go
This project implements Byte Pair Encoding (BPE) tokenization along with a Word2Vec model to generate word embeddings from a text corpus. The implementation leverages Apache Hadoop for distributed processing and includes evaluation metrics for selecting the optimal embedding dimensionality.
A self-made byte-pair encoding tokenizer
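For reference, the repositories above all revolve around the same core BPE training loop: count adjacent symbol pairs, merge the most frequent pair, and repeat. The following is a minimal Python sketch of that idea, assuming a small pre-tokenized word list; the helper names (get_pair_counts, merge_pair, train_bpe) are illustrative and not taken from any project listed here.

# Minimal sketch of naive BPE training; names and structure are illustrative.
from collections import Counter

def get_pair_counts(words):
    """Count adjacent symbol pairs across all words (symbol tuple -> frequency)."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(words, pair):
    """Replace every occurrence of `pair` with a single merged symbol."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = merged.get(tuple(out), 0) + freq
    return merged

def train_bpe(corpus, num_merges=10):
    """Learn `num_merges` merge rules from a list of words."""
    words = Counter(tuple(w) for w in corpus)  # start from single characters
    merges = []
    for _ in range(num_merges):
        pairs = get_pair_counts(words)
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent adjacent pair
        words = merge_pair(words, best)
        merges.append(best)
    return merges

if __name__ == "__main__":
    print(train_bpe(["lower", "lowest", "newer", "wider"], num_merges=5))

Production tokenizers such as tiktoken differ mainly in operating on raw bytes, using a fixed pre-tokenization pattern, and optimizing the merge search, but the learned merge list plays the same role.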