Rumi-to-Jawi

This tutorial is available as an IPython notebook at Malaya/example/rumi-jawi.

This module was trained on both standard and local (including social media) language structures, so it is safe to use for both.

Explanation

Originally, https://www.ejawi.net/converterV2.php?go=rumi was able to convert Rumi to Jawi using a heuristic method. Malaya generated data from this heuristic and learned the mapping with a deep learning model.

comel -> چوميل

[1]:
%%time
import malaya
CPU times: user 6.81 s, sys: 1.42 s, total: 8.23 s
Wall time: 10.1 s

Use deep learning model

Load LSTM + Bahdanau Attention Rumi to Jawi model.

If you are using Tensorflow 2, make sure Tensorflow Addons is already installed,

pip install tensorflow-addons -U
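As background, Bahdanau (additive) attention scores each encoder time step with a small feed-forward network over the decoder state, then softmax-normalises the scores. A toy numpy sketch of the scoring only — the weights here are random illustrative values, not Malaya's actual parameters:

```python
import numpy as np

def bahdanau_scores(query, keys, W_q, W_k, v):
    """Additive (Bahdanau) attention: score_t = v . tanh(q @ W_q + k_t @ W_k),
    softmax-normalised over encoder time steps."""
    scores = np.tanh(query @ W_q + keys @ W_k) @ v   # shape (T,)
    weights = np.exp(scores - scores.max())          # numerically stable softmax
    return weights / weights.sum()

rng = np.random.default_rng(0)
d, T = 4, 3                                   # hidden size, encoder length
q = rng.normal(size=d)                        # current decoder state
K = rng.normal(size=(T, d))                   # encoder hidden states
W_q, W_k = rng.normal(size=(d, d)), rng.normal(size=(d, d))
v = rng.normal(size=d)
w = bahdanau_scores(q, K, W_q, W_k, v)        # attention weights, sum to 1
```

The weights `w` tell the decoder how much of each encoder step to attend to when emitting the next Jawi character.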
def deep_model(quantized: bool = False, **kwargs):
    """
    Load LSTM + Bahdanau Attention Rumi to Jawi model.
    Original size 11MB, quantized size 2.92MB .
    CER on test set: 0.014847105998349451
    WER on test set: 0.06737832963079593

    Parameters
    ----------
    quantized : bool, optional (default=False)
        if True, will load 8-bit quantized model.
        Quantized model not necessary faster, totally depends on the machine.

    Returns
    -------
    result: malaya.model.tf.Seq2SeqLSTM class
    """
[2]:
model = malaya.rumi_jawi.deep_model()

Load Quantized model

To load the 8-bit quantized model, simply pass quantized = True; the default is False.

We can expect a slight accuracy drop from the quantized model, and it is not necessarily faster than the normal 32-bit float model; that depends entirely on the machine.
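To see where the size reduction and the small accuracy drop come from, here is a toy sketch of symmetric linear int8 quantization of float32 weights — an illustration of the general technique, not Malaya's actual quantization pipeline:

```python
import numpy as np

def quantize_int8(w):
    """Map float32 weights to int8 with a single symmetric scale factor."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float32 weights from int8 values."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.2, 0.03, 0.9], dtype=np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
# each weight is stored in 1 byte instead of 4, at the cost of a
# rounding error of at most scale / 2 per weight
```

That per-weight rounding error is the source of the accuracy drop, and the 4x smaller storage is why the quantized model is ~2.92MB instead of 11MB.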

[6]:
quantized_model = malaya.rumi_jawi.deep_model(quantized = True)
Load quantized model will cause accuracy drop.

Predict

def predict(self, strings: List[str], beam_search: bool = False):
    """
    Convert to target string.

    Parameters
    ----------
    strings : List[str]
    beam_search : bool, (optional=False)
        If True, use beam search decoder, else use greedy decoder.

    Returns
    -------
    result: List[str]
    """

If you want to speed up inference, set beam_search = False.
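To see why beam search is slower but can find better outputs, here is a toy two-step decoder with made-up conditional probabilities (not Malaya's decoder or vocabulary). Greedy commits to the locally best first symbol; beam search keeps alternatives and finds the globally more probable sequence:

```python
import math

# Toy distributions over a 2-symbol vocabulary:
# P(y1), then P(y2 | y1) keyed by the first symbol. Purely illustrative.
p_first = [0.55, 0.45]
p_second = {0: [0.5, 0.5], 1: [0.95, 0.05]}

def greedy():
    """Pick the single most probable symbol at each step."""
    y1 = max(range(2), key=lambda i: p_first[i])
    y2 = max(range(2), key=lambda i: p_second[y1][i])
    return [y1, y2], p_first[y1] * p_second[y1][y2]

def beam_search(beam=2):
    """Keep the `beam` best partial sequences by log-probability."""
    beams = [([i], math.log(p_first[i])) for i in range(2)]
    beams = sorted(beams, key=lambda b: b[1], reverse=True)[:beam]
    candidates = [(seq + [j], s + math.log(p_second[seq[0]][j]))
                  for seq, s in beams for j in range(2)]
    best = max(candidates, key=lambda c: c[1])
    return best[0], math.exp(best[1])
```

Here greedy ends with probability 0.275 while beam search finds a sequence with probability 0.4275, at the cost of scoring every candidate expansion of every kept beam.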

[13]:
model.predict(['saya suka makan ayam', 'ayaq acaq kotoq', 'esok birthday saya, jgn lupa bawak hadiah'])
[13]:
['ساي سوك ماكن ايم', 'اياق اچق كوتوق', 'ايسوق بيرثداي ساي، جڬن لوڤا باوق هديه']
[14]:
quantized_model.predict(['saya suka makan ayam', 'ayaq acaq kotoq', 'esok birthday saya, jgn lupa bawak hadiah'])
[14]:
['ساي سوك ماكن ايم', 'اياق اچق كوتوق', 'ايسوق بيرثداي ساي، جڬن لوڤا باوق هديه']