TensorFlow

SOL’s TensorFlow integration supports translating tf.Function, tf.Module, Keras and tf.saved_model models into SOL models. If your tf.saved_model has multiple signatures, you need to select the preferred one using sol.optimize(my_saved_model.signatures['my_signature']). By default, SOL uses the tf.saved_model.__call__ function.
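
For example, selecting a specific signature from a multi-signature SavedModel could look like this (a minimal sketch; the path and the signature name 'my_signature' are placeholders):

import tensorflow as tf
import sol

saved_model = tf.saved_model.load("/path/to/saved/model")
# Optimize one specific signature instead of the default __call__
sol_model = sol.optimize(saved_model.signatures['my_signature'])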

import tensorflow as tf
import sol
import tensorflow.keras as keras

def AlexNet(input_shape=(224, 224, 3), format="channels_last"):
	inputs = keras.Input(shape=input_shape)
	x = inputs
	x = keras.layers.Conv2D			(input_shape=input_shape, filters=64, kernel_size=(11,11), strides=(4,4), padding='same', activation='relu', data_format=format)(x)
	x = keras.layers.MaxPooling2D	(pool_size=3, strides=2, padding='valid', data_format=format)(x)
	x = keras.layers.Conv2D			(filters=192, kernel_size=5, strides=1, padding='same', activation='relu', data_format=format)(x)
	x = keras.layers.MaxPooling2D	(pool_size=3, strides=2, padding="valid", data_format=format)(x)
	x = keras.layers.Conv2D			(filters=384, kernel_size=3, strides=1, padding="same", activation='relu', data_format=format)(x)
	x = keras.layers.Conv2D			(filters=256, kernel_size=3, strides=1, padding="same", activation='relu', data_format=format)(x)
	x = keras.layers.Conv2D			(filters=256, kernel_size=3, strides=1, padding="same", activation='relu', data_format=format)(x)
	x = keras.layers.MaxPooling2D	(pool_size=3, strides=2, padding="valid", data_format=format)(x)
	x = keras.layers.Flatten		(data_format=format)(x)
	x = keras.layers.Dropout		(rate=0.5)(x)
	x = keras.layers.Dense			(4096, input_shape=(256*6*6,), activation='relu')(x)
	x = keras.layers.Dropout		(rate=0.5)(x)
	x = keras.layers.Dense			(4096, activation="relu")(x)
	x = keras.layers.Dense			(1000)(x)
	return keras.models.Model		(inputs=inputs, outputs=x)

@tf.function(input_signature=[tf.TensorSpec([None, 224, 224, 3], dtype=tf.float32)])
def tf_function(input):
	return ...

class TFModule(tf.Module):
	def __init__(self):
		super().__init__()
		self.var = tf.Variable(...)
		
	@tf.function(input_signature=[tf.TensorSpec([None, 224, 224, 3], dtype=tf.float32)])
	def __call__(self, input):
		return ...

with tf.device('/CPU:0'):
	sol_model = sol.optimize(AlexNet(), batch_size=1)
	# or
	sol_model = sol.optimize(tf_function)
	# or
	sol_model = sol.optimize(TFModule())
	# or
	sol_model = sol.optimize(tf.saved_model.load("/path/to/saved/model"))

	# Inference
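	# 'inputs' is expected to match the model's input signature,
	# e.g. a tf.Tensor or NumPy array of shape [1, 224, 224, 3] for the AlexNet above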
	output = sol_model(inputs)

	# Training for Keras Models
	sol_model.compile(...)
	sol_model.fit(inputs, targets)

	# Training for tf.Function and tf.Module
	# TODO:
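
For Keras models, a minimal training sketch could look as follows (the optimizer, loss, batch size and random dummy data are chosen purely for illustration; it assumes the SOL model keeps the Keras compile/fit API, as shown above):

import numpy as np

inputs  = np.random.rand(8, 224, 224, 3).astype(np.float32)  # dummy image batch
targets = np.random.randint(0, 1000, size=(8,))               # dummy class labels

sol_model = sol.optimize(AlexNet(), batch_size=8)
sol_model.compile(
	optimizer=keras.optimizers.SGD(learning_rate=0.01),
	loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),  # AlexNet above outputs logits
	metrics=['accuracy'],
)
sol_model.fit(inputs, targets, epochs=1, batch_size=8)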

F.A.Q.

How do I store/load a TensorFlow Keras model?

SOL models cannot be stored directly. To store/load a SOL Keras model, use the model.save_weights(...) and model.load_weights(...) methods.

# Storing
sol_model = sol.optimize(keras_model)
sol_model.save_weights(checkpoint_path)

# Loading
sol_model = sol.optimize(keras_model)
sol_model.load_weights(checkpoint_path)

More information on storing/loading the weights can be found in the TensorFlow documentation.

Supported Layers

Please refer to https://www.tensorflow.org/api/stable for how these functions are used. This documentation only lists the layers, functions and tensor functionality currently implemented within SOL.

Layers

  • Abs
  • Acos
  • Acosh
  • AddN
  • AddV2
  • ArgMax
  • ArgMin
  • Asin
  • Asinh
  • AssignVariableOp
  • Atan2
  • Atan
  • Atanh
  • AvgPool3D
  • AvgPool
  • BiasAdd
  • Cast
  • Ceil
  • ConcatV2
  • Const
  • Conv1D
  • Conv2D
  • Conv3D
  • Cos
  • Cosh
  • Cumsum
  • DepthwiseConv2dNative
  • Elu
  • Equal
  • Erf
  • Erfc
  • Exp
  • ExpandDims
  • Expm1
  • Fill
  • Floor
  • FloorDiv
  • FloorMod
  • FusedBatchNormV3
  • GatherV2
  • Greater
  • GreaterEqual
  • Identity
  • IdentityN
  • IsFinite
  • IsInf
  • IsNan
  • LeakyRelu
  • Less
  • LessEqual
  • Log1p
  • Log
  • LogicalAnd
  • LogicalNot
  • LogicalOr
  • MatMul
  • Max
  • MaxPool3D
  • MaxPool
  • MaxPoolWithArgmax
  • Maximum
  • Mean
  • Min
  • Minimum
  • Mul
  • Neg
  • NoOp
  • NotEqual
  • Pack
  • Pad
  • PartitionedCall
  • Placeholder
  • Pow
  • Prod
  • RandomUniform
  • Range
  • ReadVariableOp
  • RealDiv
  • Reciprocal
  • Relu6
  • Relu
  • Reshape
  • ResizeArea
  • ResizeBicubic
  • ResizeBilinear
  • ResizeNearestNeighbor
  • Round
  • Rsqrt
  • Select
  • SelectV2
  • Selu
  • Shape
  • Sigmoid
  • Sign
  • Sin
  • Sinh
  • Softmax
  • Softplus
  • Softsign
  • Split
  • SplitV
  • Sqrt
  • Square
  • SquaredDifference
  • Squeeze
  • StatefulPartitionedCall
  • StopGradient
  • StridedSlice
  • Sub
  • Sum
  • Tan
  • Tanh
  • Tile
  • Transpose
  • Unpack
  • Where
  • Xdivy
  • Xlog1py
  • Xlogy
  • ZerosLike

Tested Models

  • AlexNet
  • SqueezeNet [1.0, 1.1]
  • YOLO