Tips for deploying a Python Google Cloud Function

Richard P
3 min read · Oct 13, 2021

Some quirks of Cloud Functions.


The Python script was a relatively simple ETL job, but deploying it was a great exercise in architecting, debugging, and shipping production code under constraints. The script compares two Google Sheets and updates the target sheet with new or changed landing values. A Cloud Scheduler job publishes a message to Pub/Sub, which triggers the Python function.
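For context, a Pub/Sub-triggered Cloud Function uses the background-function signature sketched below. The entry-point name `main` is my assumption, and the ETL body is only a placeholder comment:

```python
import base64


def main(event, context):
    """Background Cloud Function triggered by a Pub/Sub message.

    `event` carries the base64-encoded message that Cloud Scheduler
    published; `context` holds metadata about the triggering event.
    """
    payload = (
        base64.b64decode(event["data"]).decode("utf-8")
        if "data" in event
        else ""
    )
    # Hypothetical ETL step: read both Sheets, diff them, write the changes.
    print(f"Sync triggered with payload: {payload!r}")
    return payload
```

Cloud Scheduler only needs to deliver a dummy payload here; the function ignores everything but the trigger itself.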

Here are my tips!

Look at how many Pandas DataFrames you are creating and eliminate unnecessary objects.

If your data set is large, redundant DataFrames are extremely wasteful. This is where a data scientist needs to put on a computer scientist's hat: chain (pipe) transformations together, or find better ways to create and dispose of objects in memory.
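As a sketch (the column names are made up), chaining the steps instead of naming each intermediate keeps only one DataFrame alive at a time, and `del` releases the source frame once it has been consumed:

```python
import pandas as pd

raw = pd.DataFrame({"id": [1, 2, 3], "value": [10, None, 30]})

# Instead of a named intermediate at every step (step1 = raw.dropna(); ...),
# pipe the transformations so intermediates are garbage-collected immediately:
clean = (
    raw
    .dropna(subset=["value"])              # drop rows missing a landing value
    .assign(value=lambda d: d["value"].astype(int))
    .reset_index(drop=True)
)
del raw  # explicitly release the source frame once it is no longer needed
```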

One simple way to check the sizes of DataFrames in memory is this function:
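A minimal version of such a helper, built on pandas' `DataFrame.memory_usage(deep=True)` (the helper name is mine):

```python
import pandas as pd


def df_size_mb(df: pd.DataFrame) -> float:
    """Approximate in-memory size of a DataFrame, in megabytes."""
    # deep=True also counts the Python objects behind `object` columns,
    # which is where string-heavy Sheets data hides most of its weight.
    return df.memory_usage(deep=True).sum() / 1024 ** 2


df = pd.DataFrame({"a": range(100_000), "b": ["landing"] * 100_000})
print(f"{df_size_mb(df):.2f} MB")
```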

Reduce package imports to conserve memory

Your memory consumption may surprise you: the container can crash on launch because there isn't enough memory just to import your packages. Start from a clean environment and prune your requirements.txt file before deploying. Here's the error we saw when trying to deploy, until we bumped up the container's memory allocation.

Function invocation was interrupted. Error: memory limit exceeded.

Check the recommended version of your package for your runtime

This turned out to be a major stumbling block when trying to set up the testing environment. Installing the recommended version of each package resolved the issues. Here is the list of package version recommendations for Python 3.9.
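One quick way to audit what your environment actually has installed, before comparing against the recommended versions, is the standard library's `importlib.metadata` (the package names below are illustrative):

```python
from importlib import metadata


def installed_version(dist_name: str) -> str:
    """Return the installed version of a distribution, or 'not installed'."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return "not installed"


# Compare these against the recommended versions for your runtime.
for pkg in ("pandas", "google-api-python-client"):  # illustrative names
    print(pkg, installed_version(pkg))
```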
