Before diving into deep learning itself, we first need to understand some basics of the well-known TensorFlow framework. Let's start with its basic usage.

```python
import tensorflow as tf

a = 3  # an ordinary Python variable

# Create variables the TensorFlow way.
w = tf.Variable([[0.5, 1.0]])
x = tf.Variable([[2.0], [1.0]])
y = tf.matmul(w, x)  # matrix multiplication on variables
print(y)  # a Tensor object; it holds no concrete value yet

# Variables have to be explicitly initialized before you can run ops.
# This initializes the global variables w, x, y.
init_op = tf.global_variables_initializer()

# Run the computation graph inside a Session.
with tf.Session() as sess:
    sess.run(init_op)
    print(y.eval())  # prints the concrete value
```

The result is:

```
Tensor("MatMul_2:0", shape=(1, 1), dtype=float32)
[[2.]]
```
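For contrast, the same matrix product in plain NumPy takes a couple of lines, and the value is available immediately, with no session needed:

```python
import numpy as np

w = np.array([[0.5, 1.0]])    # 1x2 row vector
x = np.array([[2.0], [1.0]])  # 2x1 column vector
y = w @ x                     # matrix multiplication, evaluated eagerly
print(y)                      # [[2.]]
```

This is exactly the gap the TensorFlow graph model introduces: the graph is built first and evaluated later.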

As you can see, a simple matrix multiplication takes quite a bit of code, which seems cumbersome, but there is no way around it: to use the framework you must follow its conventions. In practice, writing deep-learning code with this framework is not very complicated. The code above demonstrates TensorFlow's basic workflow: importing the library, defining variables, initializing them, and running the actual computation inside a Session.

Next, let's learn some of TensorFlow's built-in functions, which can usefully be studied side by side with their NumPy counterparts.

```python
from numpy import int32  # in TensorFlow, float32 is usually the preferred dtype

# Create a matrix filled with zeros.
tf.zeros([3, 4], int32)  # ==> [[0, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]

tensor = tf.constant([[1, 2, 3], [4, 5, 6]])
# Create a zero matrix with the same shape as another tensor.
tf.zeros_like(tensor)  # ==> [[0, 0, 0], [0, 0, 0]]

# Create matrices whose elements are all 1.
tf.ones([2, 3], int32)  # ==> [[1, 1, 1], [1, 1, 1]]
tf.ones_like(tensor)    # ==> [[1, 1, 1], [1, 1, 1]]

# Constant 1-D tensor populated with a value list.
# Constants must be created this way.
tensor = tf.constant([1, 2, 3, 4, 5, 6, 7])  # => [1 2 3 4 5 6 7]

# Constant 2-D tensor populated with the scalar value -1.
tensor = tf.constant(-1.0, shape=[2, 3])  # => [[-1. -1. -1.]
                                          #     [-1. -1. -1.]]

# Create evenly spaced values.
tf.linspace(10.0, 12.0, 3, name="linspace")  # => [10.0 11.0 12.0]

# tf.range(start, limit, delta)
tf.range(3, 18, 3)  # ==> [3, 6, 9, 12, 15]
```
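As a side-by-side sketch, the NumPy equivalents of these calls look almost identical (NumPy evaluates eagerly, so `print` shows the values right away):

```python
import numpy as np

print(np.zeros((3, 4), dtype=np.int32))  # 3x4 matrix of zeros
tensor = np.array([[1, 2, 3], [4, 5, 6]])
print(np.zeros_like(tensor))             # zeros with the same shape as tensor
print(np.ones((2, 3), dtype=np.int32))   # 2x3 matrix of ones
print(np.ones_like(tensor))              # ones with the same shape as tensor
print(np.linspace(10.0, 12.0, 3))        # [10. 11. 12.]
print(np.arange(3, 18, 3))               # [ 3  6  9 12 15]
```

Note that NumPy's counterpart of `tf.range` is `np.arange`, and shapes are passed as tuples rather than lists.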

As you can see, many of TensorFlow's functions closely mirror their NumPy counterparts. Next, let's look at TensorFlow's random-number utilities.

```python
# A matrix drawn from a Gaussian distribution with the given mean and standard deviation.
norm = tf.random_normal([2, 3], mean=-1, stddev=4)

# Shuffle the first dimension of a tensor.
c = tf.constant([[1, 2], [3, 4], [5, 6]])
shuff = tf.random_shuffle(c)

# Each time we run these ops, different results are generated.
# The with-block style shown above is recommended; here we create the Session directly.
sess = tf.Session()
print(sess.run(norm))
print(sess.run(shuff))
```

Running this produces (your numbers will differ, since the ops are random):

```
[[-2.4004993 5.356218 0.51297414]
[-4.353016 2.234075 -4.2948236 ]]
[[1 2]
[3 4]
[5 6]]
```
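The NumPy equivalents are `normal` and `permutation` on a random generator. A minimal sketch (the seed is arbitrary, chosen only to make reruns reproducible):

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded generator, for reproducibility
# Gaussian samples with mean -1 and standard deviation 4.
norm = rng.normal(loc=-1, scale=4, size=(2, 3))
c = np.array([[1, 2], [3, 4], [5, 6]])
shuff = rng.permutation(c)  # shuffles along the first axis (the rows)
print(norm)
print(shuff)
```

As with `tf.random_shuffle`, only the order of the rows changes; each row stays intact.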

Let's look at a small example that uses TensorFlow to print the numbers 0 through 3. It is trivial in plain Python; the point is to see how the same thing is expressed in TensorFlow.

```python
# Print the values from 0 to 3.
state = tf.Variable(0)                     # a variable initialized to 0
new_value = tf.add(state, tf.constant(1))  # add 1
update = tf.assign(state, new_value)       # assign new_value back to state

# Run the graph inside a Session.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(state))
    for _ in range(3):
        sess.run(update)
        print(sess.run(state))
```

The result is

```
0
1
2
3
```
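To make the contrast explicit, here is the same counter in plain Python, with no graph, session, or explicit assign op involved:

```python
state = 0
print(state)           # 0
for _ in range(3):
    state = state + 1  # the add-then-assign that tf.assign expresses as a graph op
    print(state)       # 1, then 2, then 3
```

In TensorFlow 1.x, `tf.assign` is needed because `state` lives in the graph, so updating it is itself an operation that must be run by the session.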

Next, let's see how to convert a NumPy array into a tensor when creating a variable. Note that this method is not recommended:

```python
import numpy as np

a = np.zeros((3, 3))
# Convert the NumPy array into a tensor.
# This works, but creating variables directly, as shown earlier, is preferred.
ta = tf.convert_to_tensor(a)
with tf.Session() as sess:
    print(sess.run(ta))
```

The result is

```
[[0. 0. 0.]
[0. 0. 0.]
[0. 0. 0.]]
```

Finally, let's look at TensorFlow's placeholder usage:

```python
# Create placeholders; concrete values are supplied only when the graph runs.
input1 = tf.placeholder(tf.float32)
input2 = tf.placeholder(tf.float32)
output = tf.multiply(input1, input2)  # element-wise multiplication

with tf.Session() as sess:
    print(sess.run([output], feed_dict={input1: [7.], input2: [2.]}))
```

The result is

```
[array([14.], dtype=float32)]
```
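The arithmetic itself is a single eager call in NumPy; the placeholder machinery exists only because TensorFlow 1.x builds the graph first and feeds the values later:

```python
import numpy as np

output = np.multiply([7.], [2.])  # element-wise multiplication
print(output)  # [14.]
```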

To summarize, this post covered some common usage of the TensorFlow framework. It is by no means exhaustive, but it records the rough usage for future reference.