HuggingFace Repository
This tutorial is available as an IPython notebook at Malaya/example/huggingface-repository.
Starting from Malaya 4.7.4, you can load Malaya models from https://huggingface.co/huseinzol05 for better download speed.
Starting from Malaya 4.7.5, Malaya uses HuggingFace as the default backend repository.
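A minimal sketch of the default behaviour, assuming no environment override (from 4.7.5 onwards, leaving MALAYA_USE_HUGGINGFACE unset means weights are pulled from https://huggingface.co/huseinzol05):
import malaya

# with the default backend, this downloads the model from the HuggingFace repository
model = malaya.emotion.transformer(model = 'albert')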
[1]:
import os
# set before importing malaya; 'false' makes Malaya download from the
# Backblaze repository (see below) instead of HuggingFace
os.environ['MALAYA_USE_HUGGINGFACE'] = 'false'
[2]:
import malaya
[3]:
malaya.__version__
[3]:
'4.7.5'
Load model from Backblaze
If some models are not available on HuggingFace, you can use Backblaze as an alternative.
First, set the global environment variable MALAYA_USE_HUGGINGFACE=false before importing Malaya,
os.environ['MALAYA_USE_HUGGINGFACE'] = 'false'
import malaya
Or simply pass use_huggingface=False as a parameter to the model load function, for example,
malaya.emotion.transformer(model = 'bert', use_huggingface = False)
[6]:
malaya.emotion.transformer(model = 'albert', use_huggingface = False)
101%|██████████| 49.0/48.6 [00:07<00:00, 6.37MB/s]
184%|██████████| 1.00/0.54 [00:01<00:00, 1.37s/MB]
135%|██████████| 1.00/0.74 [00:01<00:00, 1.22s/MB]
[6]:
<malaya.model.bert.MulticlassBERT at 0x16647ce90>
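The returned object is a regular Malaya classification model regardless of which repository served the weights. A minimal usage sketch, assuming the standard emotion classification interface (predict_proba on a list of strings; the sample sentence is only illustrative):
model = malaya.emotion.transformer(model = 'albert', use_huggingface = False)
# returns one dictionary of emotion probabilities per input string
model.predict_proba(['saya sangat gembira hari ini'])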