Basic Image Processing Functions

Basic image processing techniques

In [ ]:
import numpy as np
from PIL import Image
import imageio
import skimage
import matplotlib.pyplot as plt
from skimage.transform import rotate,resize
  1. Read an image from a path.

  2. Convert it to a grayscale image.

  3. Plot the image.

  4. Write the image to a PNG file.

In [ ]:
path = 'https://upload.wikimedia.org/wikipedia/en/thumb/7/7d/Lenna_%28test_image%29.png/440px-Lenna_%28test_image%29.png'
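One possible sketch of steps 1-4. A synthetic array stands in for the downloaded Lenna image so the cell runs offline; with network access you could read the real image with `imageio.v2.imread(path)` instead. The `gray.png` filename is illustrative:

```python
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

# Stand-in for the image read from `path`; replace with imageio.v2.imread(path)
img = (np.random.rand(64, 64, 3) * 255).astype(np.uint8)

# Convert to grayscale via PIL's "L" mode (luminance)
gray = np.array(Image.fromarray(img).convert("L"))

# Plot the grayscale image
plt.imshow(gray, cmap="gray")
plt.axis("off")

# Write it back out as a PNG file
Image.fromarray(gray).save("gray.png")
```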

Plot the histogram of the grayscale image.

In [ ]:

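One way to sketch the histogram, again with a synthetic stand-in for the grayscale array (substitute the real image from the previous cell):

```python
import numpy as np
import matplotlib.pyplot as plt

# Stand-in grayscale image; replace with the grayscale array from above
gray = (np.random.rand(64, 64) * 255).astype(np.uint8)

# 256 bins, one per possible uint8 intensity
counts, bin_edges = np.histogram(gray, bins=256, range=(0, 256))
plt.bar(bin_edges[:-1], counts, width=1)
plt.xlabel("pixel intensity")
plt.ylabel("count")
```

`plt.hist(gray.ravel(), bins=256)` is an equivalent one-liner.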
Rotate the image by 60 degrees.

In [ ]:

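A minimal sketch using the `rotate` already imported from `skimage.transform` at the top of the notebook; the stand-in array is illustrative. Note that `rotate` returns a float image scaled to [0, 1]:

```python
import numpy as np
from skimage.transform import rotate

# Stand-in grayscale image
gray = (np.random.rand(64, 64) * 255).astype(np.uint8)

# Rotate 60 degrees counter-clockwise; resize=False (the default) keeps the shape
rotated = rotate(gray, 60)
```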
Resize the image to 600x600 and show both the original and resized images.

In [ ]:

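A sketch with the `resize` imported at the top of the notebook, shown side by side with the original; the stand-in array and subplot titles are illustrative:

```python
import numpy as np
import matplotlib.pyplot as plt
from skimage.transform import resize

# Stand-in grayscale image
gray = (np.random.rand(64, 64) * 255).astype(np.uint8)

# resize returns a float image in [0, 1] with the requested shape
resized = resize(gray, (600, 600))

fig, axes = plt.subplots(1, 2)
axes[0].imshow(gray, cmap="gray")
axes[0].set_title("original")
axes[1].imshow(resized, cmap="gray")
axes[1].set_title("600x600")
```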
How many coins are in the image? Use skimage's measure module :)

In [ ]:
from skimage import measure

coins_path = 'https://media.imna.ir/d/2018/09/29/3/1530690.jpg'
np_image = imageio.imread(coins_path)
# your code
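One way to count the coins is to binarize the image and count connected components with `measure.label`. The synthetic three-circle image below is a stand-in for the downloaded photo (with the real photo you would first grayscale it and threshold it, e.g. with `skimage.filters.threshold_otsu`):

```python
import numpy as np
from skimage import measure

# Synthetic binary image with three disc-shaped "coins" (stand-in for np_image)
img = np.zeros((100, 100), dtype=int)
yy, xx = np.ogrid[:100, :100]
for cy, cx in [(20, 20), (50, 70), (80, 30)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < 64] = 1

# Label connected foreground regions; num is the number of components
labels, num = measure.label(img, return_num=True)
print(f"found {num} coins")
```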

Use the deeplake module to download the required files for the following tasks.

In [ ]:
! pip install deeplake
In [ ]:
import deeplake
ds = deeplake.load("hub://activeloop/cifar100-test")
  1. Randomly read 10000 test images from this dataset and save them as PNG files using a for loop. Measure the time needed to read and save them.
  2. Use Python multithreading to read and save these files and measure the process time.
  3. Use Python multiprocessing to read and save these files and measure the process time.
In [ ]:
from typing import Callable
from datetime import datetime
from PIL import Image
import os

saving_address = os.path.expanduser("~/images_cifar100")  # expand "~", os.makedirs won't
os.makedirs(saving_address, exist_ok=True)
def timeit(function: Callable):
    first_time = datetime.now()
    function()
    last_time = datetime.now()
    print(f"process took {(last_time - first_time).total_seconds()} seconds.")
def save_image(index):
    image = ds.images[index].numpy()
    Image.fromarray(image).save(f"{saving_address}/{index}.png")
In [ ]:
import random
def save_loop():
    # your code here!
    return
timeit(save_loop)
process took 0.949402 seconds.
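A possible shape for the loop version, using synthetic arrays as stand-ins for `ds.images[i].numpy()` so it runs without deeplake; the counts (100 images, 50 saved) and the `images_demo` directory are illustrative:

```python
import os
import random
from datetime import datetime
import numpy as np
from PIL import Image

# Stand-ins for ds.images[i].numpy()
images = [(np.random.rand(32, 32, 3) * 255).astype(np.uint8) for _ in range(100)]
out_dir = "images_demo"
os.makedirs(out_dir, exist_ok=True)

def save_loop():
    # Pick 50 distinct random indices and save each image as a PNG
    for i in random.sample(range(len(images)), 50):
        Image.fromarray(images[i]).save(f"{out_dir}/{i}.png")

start = datetime.now()
save_loop()
print(f"process took {(datetime.now() - start).total_seconds()} seconds.")
```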
In [ ]:
import random
from multiprocessing import Pool

def save_multiprocessing():
    # your code here!
    return
timeit(save_multiprocessing)
process took 0.258296 seconds.
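A sketch of the multiprocessing version. It uses the "fork" start method so it stays runnable directly from a notebook cell on Linux; with the default "spawn" method you would guard the call with `if __name__ == "__main__":`. The worker generates its own stand-in image instead of reading from deeplake, and the pool size and file count are illustrative:

```python
import os
import numpy as np
from multiprocessing import get_context
from PIL import Image

out_dir = "images_mp"
os.makedirs(out_dir, exist_ok=True)

def save_one(i):
    # Stand-in for ds.images[i].numpy(); each worker saves one PNG
    img = (np.random.rand(32, 32, 3) * 255).astype(np.uint8)
    Image.fromarray(img).save(f"{out_dir}/{i}.png")

def save_multiprocessing():
    # 4 worker processes split the 50 saves between them
    with get_context("fork").Pool(4) as pool:
        pool.map(save_one, range(50))

save_multiprocessing()
```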
In [ ]:
import random
from multiprocessing.pool import ThreadPool
def save_multithreading():
    # your code here!
    return
timeit(save_multithreading)
process took 0.990368 seconds.
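A sketch of the threaded version with `ThreadPool`, again on synthetic stand-in images; thread count and directory name are illustrative. Threads can help here because PIL releases the GIL during file I/O:

```python
import os
import numpy as np
from multiprocessing.pool import ThreadPool
from PIL import Image

out_dir = "images_mt"
os.makedirs(out_dir, exist_ok=True)

# Stand-ins for ds.images[i].numpy()
images = [(np.random.rand(32, 32, 3) * 255).astype(np.uint8) for _ in range(50)]

def save_one(i):
    Image.fromarray(images[i]).save(f"{out_dir}/{i}.png")

def save_multithreading():
    # 8 threads share the saving work
    with ThreadPool(8) as pool:
        pool.map(save_one, range(len(images)))

save_multithreading()
```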
  1. Randomly read 600 test images from this dataset and blur each one twice, with kernel sizes (3,3) and (5,5), using a for loop. Measure the time needed to read and process them.
  2. Use Python multithreading to read and process these images and measure the process time.
  3. Use Python multiprocessing to read and process these images and measure the process time.
In [ ]:
import cv2
In [ ]:
import random
def save_loop():
    # your code here!
    return
timeit(save_loop)
process took 41.299255 seconds.
In [ ]:
import random
from multiprocessing import Pool

def save_multiprocessing():
    # your code here!
    return
timeit(save_multiprocessing)
process took 16.460644 seconds.
In [ ]:
import random
from multiprocessing.pool import ThreadPool
def save_multithreading():
    # your code here!
    return
timeit(save_multithreading)
process took 17.336391 seconds.

Where should we use multithreading? What about multiprocessing?