9. Miscellaneous learnings¶
9.2. Running bash in ipynb¶
## copy a file
# !cp myscript.py myscript2.py
## run timeit on python
!python -m timeit -r 20 '"-".join(str(n) for n in range(100))'
## -r sets the number of timing repetitions
## Typical output: "20000 loops, best of 20: ...", meaning the statement
## ran in a loop 20000 times, repeated 20 times for accuracy.
## install python modules
# !pip install pstats
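The same measurement can be made without the shell escape by calling the stdlib timeit module directly from Python (a minimal sketch; repeat=20 mirrors the -r 20 above, and the number value is an arbitrary choice):

```python
import timeit

# repeat=20 mirrors `-r 20`; number fixes the loop count per repetition.
times = timeit.repeat('"-".join(str(n) for n in range(100))',
                      repeat=20, number=1000)
print(min(times))  # convention: report the best (minimum) of the repetitions
```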
9.3. Save a file dynamically into a .py or .R file from jupyter notebook (Magic Save)¶
# %save -a example.py 28  ## change 28 to the input history line number of the cell to save
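Under the hood, the -a option appends to the file instead of overwriting it. A plain-Python equivalent of that append step (the snippet string here is a hypothetical stand-in for a cell's source) is:

```python
# Append a code snippet to example.py, as %save -a does.
# The snippet below is a hypothetical stand-in for a notebook cell's source.
snippet = 'print("hello from a saved cell")\n'
with open("example.py", "a") as f:
    f.write(snippet)
```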
9.4. Convert .ipynb to .md file¶
Here FORMAT can be markdown, html, pdf, latex, or slides.
Note
Keep in mind that this does not convert to MyST Markdown
# jupyter nbconvert --to FORMAT notebook.ipynb
9.5. Guppy & Objgraph (memory profiling misc.)¶
There are a number of ways to profile an entire Python application. The most Pythonic is Guppy: take a snapshot of the heap before and after a critical process, then compare the total memory to pinpoint possible memory spikes within common objects. Note, however, that Guppy apparently only works with Python 2.
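Since Guppy targets Python 2, the same snapshot-and-compare workflow can be sketched on Python 3 with the standard-library tracemalloc module (the list allocation below is just a placeholder for the critical process being profiled):

```python
import tracemalloc

tracemalloc.start()
before = tracemalloc.take_snapshot()

# Placeholder workload standing in for the critical process.
data = [list(range(100)) for _ in range(1000)]

after = tracemalloc.take_snapshot()
stats = after.compare_to(before, "lineno")
# The top entries pinpoint where memory grew between the two snapshots.
for stat in stats[:3]:
    print(stat)
```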
# !pip3 install objgraph
import objgraph
objgraph.show_most_common_types() ## overview of the objects in memory
class MyBigFatObject(object):
    pass

def computate_something(_cache={}):
    _cache[42] = dict(foo=MyBigFatObject(),
                      bar=MyBigFatObject())
    # a very explicit and easy-to-find "leak" but oh well
    x = MyBigFatObject()  # this one doesn't leak
objgraph.show_growth(limit=3)
computate_something()
objgraph.show_growth()
It’s easy to see the MyBigFatObject instances that appeared and were not freed, so we can trace the reference chain back.
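objgraph provides find_backref_chain() and show_backrefs() for exactly this. As a dependency-free sketch, the core idea — walking back to whatever still refers to the leaked object — can be approximated with the stdlib gc module:

```python
import gc

class MyBigFatObject:
    pass

_cache = {}
_cache[42] = dict(foo=MyBigFatObject(), bar=MyBigFatObject())

leaked = _cache[42]["foo"]
# Step one link back up the reference chain: which dicts still refer to it?
referrers = [r for r in gc.get_referrers(leaked) if isinstance(r, dict)]
print(any("foo" in r for r in referrers))  # the inner cached dict keeps it alive
```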