TensorFlow

SOL’s TensorFlow integration supports translating Keras models into SOL models. You can also use functions from tf.math or tf.nn, but they need to be embedded into the Keras model (see the second example below).

import tensorflow as tf
import sol
import keras

def AlexNet(input_shape=(224, 224, 3), num_classes=1000, data_format="channels_last"):
	inputs = keras.Input(shape=input_shape)
	model = inputs
	# Feature extractor
	model = keras.layers.Conv2D(filters=64, kernel_size=(11,11), strides=(4,4), padding='same', activation='relu', data_format=data_format)(model)
	model = keras.layers.MaxPooling2D(pool_size=3, strides=2, padding='valid', data_format=data_format)(model)
	model = keras.layers.Conv2D(filters=192, kernel_size=5, strides=1, padding='same', activation='relu', data_format=data_format)(model)
	model = keras.layers.MaxPooling2D(pool_size=3, strides=2, padding="valid", data_format=data_format)(model)
	model = keras.layers.Conv2D(filters=384, kernel_size=3, strides=1, padding="same", activation='relu', data_format=data_format)(model)
	model = keras.layers.Conv2D(filters=256, kernel_size=3, strides=1, padding="same", activation='relu', data_format=data_format)(model)
	model = keras.layers.Conv2D(filters=256, kernel_size=3, strides=1, padding="same", activation='relu', data_format=data_format)(model)
	model = keras.layers.MaxPooling2D(pool_size=3, strides=2, padding="valid", data_format=data_format)(model)
	# Classifier
	model = keras.layers.Flatten(data_format=data_format)(model)
	model = keras.layers.Dropout(rate=0.5)(model)
	model = keras.layers.Dense(4096, activation='relu')(model)
	model = keras.layers.Dropout(rate=0.5)(model)
	model = keras.layers.Dense(4096, activation="relu")(model)
	model = keras.layers.Dense(units=num_classes)(model)
	return keras.models.Model(inputs=inputs, outputs=model)

with tf.device('/CPU:0'):
	model = AlexNet()

	sol_model = sol.optimize(model, batch_size=1)
	sol_model.convert("x86") # See https://issue.sol-ai.org/289

	# Inference (example input matching the model's 224x224x3 input shape and batch_size=1)
	inputs = tf.random.normal((1, 224, 224, 3))
	output = sol_model(inputs)

	# Training ('targets' stands for your training labels)
	sol_model.compile(...)
	sol_model.fit(inputs, targets)
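
The snippet below is a minimal sketch of what "embedded into the Keras model" means in practice: tf.math and tf.nn calls are applied directly to Keras tensors inside a functional model, which tf.keras wraps in TFOpLambda layers (listed among the supported layers below). The layer sizes and the choice of ops are illustrative only; the sol.optimize call mirrors the one above.

import tensorflow as tf
import sol

# Illustrative toy model mixing Keras layers with tf.math / tf.nn calls
inputs = tf.keras.Input(shape=(128,))
x = tf.keras.layers.Dense(64)(inputs)
x = tf.math.log1p(tf.math.abs(x))  # tf.math ops become TFOpLambda layers
x = tf.nn.relu6(x)                 # tf.nn ops become TFOpLambda layers
outputs = tf.keras.layers.Dense(10)(x)
model = tf.keras.models.Model(inputs=inputs, outputs=outputs)

sol_model = sol.optimize(model, batch_size=1)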

Tested Networks

  • AlexNet
  • SqueezeNet (1.0, 1.1)

Supported Layers

Please refer to https://www.tensorflow.org/api/stable for how these functions are used. This documentation only lists which layers, functions, and tensor operations are currently implemented within SOL.

Layers

Please see the following list of all supported operators.

  • tf
    • clip_by_value
    • constant
    • expand_dims
    • fill
    • identity
    • identity_n
    • ones
    • ones_like
    • range
    • reshape
    • split
    • squeeze
    • stack
    • unstack
    • where
    • zeros
    • zeros_like
  • tf.math
    • abs
    • accumulate_n
    • acos
    • acosh
    • add
    • add_n
    • argmax
    • argmin
    • asin
    • asinh
    • atan2
    • atan
    • atanh
    • ceil
    • cos
    • cosh
    • divide
    • equal
    • erf
    • erfc
    • exp
    • expm1
    • floor
    • floordiv
    • floormod
    • greater
    • greater_equal
    • is_finite
    • is_inf
    • is_nan
    • less
    • less_equal
    • log1p
    • log
    • log_sigmoid
    • logical_and
    • logical_not
    • logical_or
    • logical_xor
    • maximum
    • minimum
    • mod
    • multiply
    • negative
    • not_equal
    • pow
    • reciprocal
    • reduce_logsumexp
    • reduce_max
    • reduce_min
    • reduce_prod
    • reduce_sum
    • round
    • rsqrt
    • scalar_mul
    • sigmoid
    • sign
    • sin
    • sinh
    • softplus
    • sqrt
    • square
    • squared_difference
    • subtract
    • tan
    • tanh
    • truediv
    • xdivy
    • xlog1py
    • xlogy
  • tf.nn
    • avg_pool1d
    • avg_pool2d
    • avg_pool3d
    • avg_pool
    • dropout
    • elu
    • gelu
    • leaky_relu
    • max_pool1d
    • max_pool2d
    • max_pool3d
    • max_pool
    • max_pool_with_argmax
    • pool
    • relu6
    • relu
    • selu
    • silu
    • softmax
    • softplus
    • softsign
  • tf.keras.activations
    • elu
    • exponential
    • gelu
    • hard_sigmoid
    • linear
    • softmax
    • softplus
    • softsign
    • swish
    • tanh
  • tf.keras.layers
    • Activation('elu')
    • Activation('exponential')
    • Activation('gelu')
    • Activation('hard_sigmoid')
    • Activation('linear')
    • Activation('relu')
    • Activation('sigmoid')
    • Activation('softmax')
    • Activation('softplus')
    • Activation('softsign')
    • Activation('swish')
    • Activation('tanh')
    • Add
    • AlphaDropout
    • AveragePooling1D
    • AveragePooling2D
    • AveragePooling3D
    • BatchNormalization
    • Concatenate
    • Conv1D
    • Conv2D
    • Conv3D
    • Dense
    • DepthwiseConv2D
    • Dropout
    • ELU
    • Embedding
    • Flatten
    • GlobalAveragePooling1D
    • GlobalAveragePooling2D
    • GlobalAveragePooling3D
    • GlobalMaxPool1D
    • GlobalMaxPool2D
    • GlobalMaxPool3D
    • InputLayer
    • LayerNormalization
    • LeakyReLU
    • MaxPooling1D
    • MaxPooling2D
    • MaxPooling3D
    • Maximum
    • Minimum
    • PReLU
    • ReLU
    • Reshape
    • SeparableConv1D
    • SeparableConv2D
    • Softmax
    • Subtract
    • TFOpLambda
    • ThresholdedReLU
    • ZeroPadding1D
    • ZeroPadding2D
    • ZeroPadding3D
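
As a small illustration of how the activation entries above typically show up in a model (a sketch only; the lists above define exactly what SOL implements), the same supported activation can be written as a layer argument, as an Activation layer, or via tf.keras.activations:

import tensorflow as tf

inputs = tf.keras.Input(shape=(32,))
a = tf.keras.layers.Dense(16, activation='gelu')(inputs)                      # string argument
b = tf.keras.layers.Activation('gelu')(tf.keras.layers.Dense(16)(inputs))     # Activation layer
c = tf.keras.layers.Dense(16, activation=tf.keras.activations.gelu)(inputs)   # tf.keras.activations
outputs = tf.keras.layers.Concatenate()([a, b, c])
model = tf.keras.models.Model(inputs=inputs, outputs=outputs)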