
Loading the Model (and Complaining about Memory Usage)

How I loaded the files in Python

I used joblib to save the trained model to disk, and joblib again to load it back into memory.
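The round trip looks roughly like this; a minimal sketch where the filename "model.joblib" and the dict standing in for the real scikit-learn model are illustrative:

```python
# A minimal sketch of the joblib save/load round trip.
import joblib

# Stand-in for the trained classifier (the real object was a
# scikit-learn text-classification model).
model = {"weights": [0.1, 0.2, 0.3], "vocab": ["good", "bad"]}

joblib.dump(model, "model.joblib")        # serialize to disk
restored = joblib.load("model.joblib")    # deserialize back into memory
print(restored["vocab"])
```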

I used a LemmaTokenizer to parse the text. Strictly speaking it is a lemmatizer, not a stemmer: lemmatization maps each word to its dictionary form ("feet" to "foot"), while a stemmer just chops off suffixes.
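A sketch of such a tokenizer, in the style of the scikit-learn documentation's custom-tokenizer example, assuming NLTK and its WordNet data are available:

```python
# Callable tokenizer that lemmatizes every token before vectorization.
# Assumes NLTK is installed and its "punkt" and "wordnet" data downloaded.
from nltk import word_tokenize
from nltk.stem import WordNetLemmatizer

class LemmaTokenizer:
    def __init__(self):
        self.wnl = WordNetLemmatizer()

    def __call__(self, doc):
        # Split the document into tokens, then map each to its lemma.
        return [self.wnl.lemmatize(t) for t in word_tokenize(doc)]

# Plugged into the vectorizer at training time, e.g.:
#     TfidfVectorizer(tokenizer=LemmaTokenizer())
```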


Memory Usage

It turns out that the 300 MB model took around 800 MB of RAM once loaded into memory. That meant Heroku couldn't host it: the slug was too big and the RAM usage far exceeded the dyno's limit. Very frustrating. The only way to host this would be on a server on AWS, Google Cloud, or DigitalOcean.
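One likely reason for the gap is that the file on disk is a compressed serialization, while loading rebuilds full-size live Python objects. A minimal sketch of how on-disk size can understate memory cost (the array and "model.joblib" are illustrative stand-ins):

```python
# Compare joblib's compressed on-disk size with the in-memory size.
import os
import joblib
import numpy as np

weights = np.zeros(2_000_000)                      # ~16 MB live in memory
joblib.dump(weights, "model.joblib", compress=3)   # compressed on disk

on_disk = os.path.getsize("model.joblib") / 1e6
in_ram = weights.nbytes / 1e6
print(f"{on_disk:.2f} MB on disk vs {in_ram:.1f} MB in memory")
```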

I decided to wrap the model in a Django website and voilà, I'm done!

Published in Machine Learning
