Tokenizer Efficiency

The Sarvam tokenizer is optimized for efficient tokenization across all 22 scheduled Indian languages, spanning 12 different scripts, directly reducing the cost and latency of serving in Indian languages. It outperforms other open-source tokenizers in encoding Indic text efficiently, as measured by the fertility score: the average number of tokens required to represent a word. It is significantly more efficient than other tokenizers for low-resource languages such as Odia, Santali, and Manipuri (Meitei). The chart below shows the average fertility of various tokenizers across English and all 22 scheduled languages.
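The fertility metric above can be sketched in a few lines. This is an illustrative computation only: `toy_tokenize` is a hypothetical stand-in (it just splits each word into 4-character chunks), not the Sarvam tokenizer, and serves only to show how fertility is measured.

```python
def toy_tokenize(text: str) -> list[str]:
    # Hypothetical stand-in tokenizer: splits each whitespace-delimited
    # word into chunks of up to 4 characters.
    tokens = []
    for word in text.split():
        tokens.extend(word[i:i + 4] for i in range(0, len(word), 4))
    return tokens

def fertility(tokenize, corpus: list[str]) -> float:
    """Average number of tokens per word over a corpus; lower is better."""
    total_tokens = sum(len(tokenize(line)) for line in corpus)
    total_words = sum(len(line.split()) for line in corpus)
    return total_tokens / total_words

corpus = ["tokenizers compress text into subword units"]
print(round(fertility(toy_tokenize, corpus), 2))  # 11 tokens / 6 words -> 1.83
```

A fertility near 1.0 means the tokenizer represents most words as a single token; high fertility for a language inflates sequence length, and with it serving cost and latency.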