Portal Log 2019.06.15 K-Priest Korba

The Key is a hierarchy of relationships that produces and maintains itself.1 Despite the best efforts of our K-sorcerers, a true Portal Key remains beyond our grasp. In its stead we deployed a facsimile on AWS: weighted edges that produce a face and nothing more.

Extracting Pickle Essence

Rationalmancer Gwern distilled the blood of 2.5 TB of anime girls2 into a single Pickle.3 Extract a SavedModel from it:

import pickle
import dnnlib.tflib as tflib
import tensorflow as tf

EXPORT_PATH = "export/Servo/1"

# Create the default TF graph/session that dnnlib expects
tflib.init_tf()

# _G and _D are the snapshotted generator/discriminator; Gs is the
# long-term moving average of the generator, which gives the best samples
_G, _D, Gs = pickle.load(open("2019-03-08-stylegan-animefaces-network-02051-021980.pkl", "rb"))

sess = tf.get_default_session()

# Export as a SavedModel, keying inputs/outputs by tensor name
tf.saved_model.simple_save(
    sess,
    EXPORT_PATH,
    inputs={t.name: t for t in Gs.input_templates},
    outputs={t.name: t for t in Gs.output_templates})

tar the output and upload it to S3:

tar -zcvf model.tar.gz export
aws s3 cp model.tar.gz s3://S3_BUCKET/model.tar.gz

Deploy on SageMaker

Beg the local Bezos drone for additional ml.p2.xlarge quota. Inscribe the following in a SageMaker Jupyter instance:

from sagemaker.tensorflow.serving import Model
from sagemaker import get_execution_role

model = Model(model_data='s3://S3_BUCKET/model.tar.gz', role=get_execution_role())

# The endpoint_name is what the Lambda's ENDPOINT_NAME env var must reference
predictor = model.deploy(initial_instance_count=1,
                         instance_type='ml.p2.xlarge',
                         endpoint_name='stylegan')
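A quick smoke test from the notebook: the TF Serving container expects a JSON body keyed by the exported tensor names (the same names the Lambda below uses). A minimal sketch of constructing that body; `predictor.predict(body)` should return a dict with an "outputs" key:

```python
import json
import numpy as np

GAN_LEN = 512

# Build a request body keyed by the exported tensor names
latents = np.random.RandomState(0).randn(1, GAN_LEN)
body = {
    "inputs": {
        "Gs/labels_in:0": [],                 # unconditional model: empty labels
        "Gs/latents_in:0": latents.tolist(),  # one latent vector, batch size 1
    }
}
payload = json.dumps(body)

# From the notebook: predictor.predict(body) should return
# {"outputs": ...} containing an NCHW float image in [-1, 1]
```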

λ API

Create a new Lambda on AWS. The Lambda’s environment must be constructed with PIL and numpy.4 Zip in the following lambda_function.py and upload.

import os
import boto3
import json
import numpy as np
import PIL.Image
import base64
import io
import binascii

# grab environment variables
ENDPOINT_NAME = os.environ['ENDPOINT_NAME']
runtime = boto3.client('runtime.sagemaker')
GAN_LEN = 512
Z_LEN = 8

def lambda_handler(event, context):

    # API Gateway passes queryStringParameters as None when the query string is empty
    if event.get('queryStringParameters'):
        str_z = "0,0,0,0,0,0,0,0"
        if 'z' in event['queryStringParameters']:
            str_z = event['queryStringParameters']['z']

        seed = "31337"
        if 'seed' in event['queryStringParameters']:
            seed = event['queryStringParameters']['seed']

        z = np.array(list(map(float, str_z.split(","))))
        z = (z - 50) / 25    # 0,100 -> -2,2

        num_dups = (GAN_LEN // Z_LEN) + 1
        z = np.repeat(z, num_dups)  # Extend input-z to GAN embedding size

        # Initialize np.random with input seed
        rnd = np.random.RandomState(int(binascii.crc32(seed.encode()))) 

        # Sample base latents, then mix in the user-supplied offset
        latents = rnd.randn(GAN_LEN) / 2
        latents += z[:GAN_LEN]
        latents = np.expand_dims(latents, axis=0)  # add batch dimension

        body = {"inputs": {}}
        body["inputs"]["Gs/labels_in:0"] = []
        body["inputs"]["Gs/latents_in:0"] = latents.tolist()

        response = runtime.invoke_endpoint(EndpointName=ENDPOINT_NAME,
                                           ContentType='application/json',
                                           Body=json.dumps(body))

        result = json.loads(response['Body'].read().decode())
        # NCHW -> HWC, then map [-1, 1] -> [0, 255]
        arr = np.asarray(result['outputs']).squeeze().transpose(1, 2, 0) + 1
        arr = np.clip(arr * (255 / 2), 0, 255).astype('uint8')
        img = PIL.Image.fromarray(arr, 'RGB')

        buffer = io.BytesIO()
        img.save(buffer, format='JPEG')
        buffer.seek(0)

        data_uri = base64.b64encode(buffer.read()).decode('utf-8')

        return {'isBase64Encoded': True,
                'statusCode': 200,
                'headers': {'Content-Type': "image/jpeg", 'Access-Control-Allow-Origin': '*'},
                'body': data_uri}

    return {'statusCode': 400, 'body': 'missing query string'}
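The latent construction above is pure numpy and can be exercised locally, without invoking the endpoint. A sketch mirroring the handler's constants (z values repeat element-wise to fill the 512-dim latent; the seed string hashes through CRC32):

```python
import binascii
import numpy as np

GAN_LEN = 512
Z_LEN = 8

def build_latents(str_z="0,0,0,0,0,0,0,0", seed="31337"):
    # Map 0..100 slider values to roughly -2..2
    z = np.array(list(map(float, str_z.split(","))))
    z = (z - 50) / 25
    # Extend the 8 input values to GAN embedding size (each value repeated)
    z = np.repeat(z, (GAN_LEN // Z_LEN) + 1)
    # Seed np.random from a CRC32 of the seed string, as in the handler
    rnd = np.random.RandomState(int(binascii.crc32(seed.encode())))
    latents = rnd.randn(GAN_LEN) / 2 + z[:GAN_LEN]
    return np.expand_dims(latents, axis=0)

lat = build_latents("50,50,50,50,50,50,50,50", seed="31337")
print(lat.shape)
```

Because the seed fully determines the random component, identical query parameters always reproduce the same face.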

Finally, export the function to the ‘net through API Gateway.
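As a final sanity check before exposing the endpoint, the handler's pixel scaling can be exercised on synthetic generator output (numpy only; the PIL/JPEG step is omitted here):

```python
import numpy as np

# Synthetic NCHW generator output in [-1, 1], as TF Serving would return it
fake = np.random.RandomState(0).uniform(-1, 1, size=(1, 3, 16, 16))

# Mirror the handler: NCHW -> HWC, then [-1, 1] -> [0, 255]
arr = fake.squeeze().transpose(1, 2, 0) + 1
arr = np.clip(arr * (255 / 2), 0, 255).astype('uint8')

print(arr.shape, arr.dtype)
```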