TensorFlow 2.0 Alpha: Could not find any TPU devices
I've created a custom model with TensorFlow 2.0 Alpha and want to use distributed training on a TPU. My setup code:
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.tpu.experimental.initialize_tpu_system(resolver)
tpu_strategy = tf.distribute.experimental.TPUStrategy(resolver)
When I run this, the TPUStrategy setup fails with the following error:
/usr/local/lib/python3.6/dist-packages/tensorflow/python/tpu/tpu_strategy_util.py in get_first_tpu_host_device(cluster_resolver)
41 [x for x in context.list_devices() if "device:TPU:" in x])
42 if not tpu_devices:
43 raise RuntimeError("Could not find any TPU devices")
44 spec = tf_device.DeviceSpec.from_string(tpu_devices[0])
45 task_id = spec.task
RuntimeError: Could not find any TPU devices
I'm using Ubuntu 16.04 and installed TensorFlow 2.0 Alpha via pip.
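In case it matters, here is how I checked whether any TPU address is even visible to my environment. My understanding (an assumption, not verified against the TensorFlow source) is that TPUClusterResolver falls back to environment variables such as TPU_NAME when no address is passed to it, and that Colab exposes the address as COLAB_TPU_ADDR:

```python
import os

def find_tpu_address():
    """Return the first TPU address found in common env vars, else None.

    COLAB_TPU_ADDR and TPU_NAME are the variable names I believe the
    resolver checks (assumed, not confirmed from the TF source).
    """
    for var in ("COLAB_TPU_ADDR", "TPU_NAME"):
        addr = os.environ.get(var)
        if addr:
            return addr
    return None

print(find_tpu_address())
```

On my machine this prints None, which may be why the resolver finds no TPU devices.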