This is a series that walks you through solid, practical steps, from gathering data to deploying the model in production🚀.
Hey everyone, Shubhamai here. I am a Machine Learning Engineer with skills in ML, DL, Computer Vision, Natural Language Processing, and much more…
What’s the project?
So the project we are building takes a plant's leaf🌿 from the user and predicts whether the leaf is healthy or not😷, using TensorFlow 2.0. Let's get started!
In this part we will cover the following:
- Installing Dependencies & Streamlit Intro
- Getting the Trained Model
- Making the Frontend
- Importing Model Code into Frontend
- Choosing our Deployment Service
- Create a Heroku Project and Command
- Deploying the website!
In this section, we will download some necessary dependencies and libraries
So here is what my requirements.txt looks like
It mainly contains Streamlit (which we will talk about soon), TensorFlow 2.0.0, and some small utility libraries.
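The file itself isn't reproduced here, but for this stack a requirements.txt would plausibly look like the following (the utility libraries and version pins are illustrative, not the author's exact file):

```
streamlit
tensorflow==2.0.0
numpy
Pillow
```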
In a few words, Streamlit is a Python framework that allows you to build custom ML tools. It's a completely free and open-source tool. It lets you make a React web app and include your ML models in it, completely in Python.
If that still sounds a bit confusing, let's look at a demo.
Install Streamlit with
pip install streamlit
The amazing part of Streamlit that I love is that it's so SIMPLE, and yet so powerful. It lets you make a React web app with 100% Python.
Getting the Trained Model
In this section, we will import our trained model from the last part and decide whether we need the full model or just the model weights.
Now, if you remember part 1, we exported our model architecture and weights into .h5 files; this is the time when we need them!
We will choose the model weights, because the full model with architecture is ~350 MB whereas the weights alone are ~100 MB. We have limited memory in the cloud, so we have to go with the weights (we will rebuild the architecture in code).
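As a hedged illustration of the two export options (a toy model, not the real classifier, and the file names are placeholders):

```python
import tensorflow as tf

# A tiny stand-in for the real classifier trained in part 1.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu", input_shape=(3,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# Option 1: full model — architecture + weights — ~350 MB in our case.
model.save("full_model.h5")

# Option 2: weights only — ~100 MB in our case;
# the architecture gets rebuilt in code before loading these.
model.save_weights("model_weights.h5")
```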
Making the Frontend
In this section, we will make a beautiful and responsive React web app, all in Python.
streamlit run app.py
With such a small amount of code, it looks something like this:
Importing Our Code into Frontend
In this section, we will finish building our application by importing the model weights and constructing the architecture.
✔ Importing Libraries
✔ Importing the model
Now you may be wondering what this decorator is.
Streamlit provides a caching mechanism that allows your app to stay performant even when loading data from the web, manipulating large datasets, or performing expensive computations. This is done with the @st.cache decorator. — Streamlit Docs
In my case, this decorator allows the app to load a very big model just once; the result is cached on the server, so every time you visit the website, it runs straight away.
allow_output_mutation (boolean) — Streamlit normally shows a warning when return values are not mutated, as that can have unintended consequences. This is done by hashing the return value internally. If you know what you're doing and would like to override this warning, set this to True. — Streamlit Docs
I also don’t know what that is 😛, but my app was not working, and when I set it to true in the app. Everything goes well.
We build the model architecture and then load our pre-trained weights from the last part.
How cool is that? Ahh, let's move forward.
- Then we do some more tweaking in our code… and then this is the final code.
Choosing our Deployment Services
In this section, we will choose our deployment service, set up an account, and set up our project.
There are so many ways to deploy a Machine Learning model, and so many deployment services, like —
Now, Heroku and Netlify are not made for ML model deployment, but for hobby and micro-scale projects, they work pretty well.
In our project, we will choose Heroku, because —
It’s soo easy to deploy.
It allows the developer to think about the code, not infrastructures.
Allows all types for modern technologies to deploy (like react, flask).
So make an account; if you already have one, let's move on to the next section.
Create a Heroku Project and Command
In this section, we will make a Heroku project and set up some commands and requirements that tell the server what has to be done.
Ahh, also guys, you need the Heroku Command Line Interface for this, so download it; I'll be here waiting for you 😏.
I hope you have downloaded it. First, let's look at my project directory structure.
> **Procfile** — Heroku apps include a **Procfile** that specifies the commands that are executed by the app on startup. — [Heroku Docs](https://devcenter.heroku.com/articles/procfile)
This is how my Procfile looks —
web: sh setup.sh && streamlit run app.py
It first runs the setup.sh file (we will talk about this soon) and then starts the app, the same way we run it on our local machine.
mkdir -p ~/.streamlit/
echo "\
[general]\n\
email = \"<your email>\"\n\
" > ~/.streamlit/credentials.toml
echo "\
[server]\n\
headless = true\n\
enableCORS=false\n\
port = $PORT\n\
" > ~/.streamlit/config.toml
Now, to be honest, I don't fully know everything setup.sh does, but in short, it sets a PORT for our server, writes some credentials, configures CORS, stuff like that.
Now, the next file is pretty important; you create it by running this command.
pip freeze > requirements.txt
It will create a requirements.txt file listing all the dependencies and libraries we use in our code, such as Streamlit, TensorFlow, etc.
Follow these steps —
git init
git stage .
git commit -m "Project Completed"
heroku create
git remote add origin <your heroku git link (you will get this from the previous command)>
Deploying the website!
In this section, we will run a single command and magic will happen, your site will be deployed!
RUN! (Don’t run 🏃♀️, just run this command).
git push origin master
This command will take some time, as the server installs dependencies and runs the Procfile, but at the end you will get a link to your deployed site!
If you go to your Heroku dashboard, you will see your app there.
Now you just learned —
From Getting Dataset to Model in Production🚀
I am so happy, and I hope you are too. You can also get all the code for this project from here.
Hey, I hope you liked this post. Let me know what you'd like to see in the next post, or any corrections or improvements you want to see. Share this with your friends. I am Shubhamai, and I will see you in the next post.
— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —