improve TFPark readme #1684

Merged 3 commits on Oct 12, 2019
35 changes: 19 additions & 16 deletions pyzoo/zoo/examples/tensorflow/tfpark/README.md
@@ -25,12 +25,14 @@ export PYTHONPATH=$PWD/models/research/slim:$PYTHONPATH
Using TFDataset as data input

```bash
-python keras_dataset.py
+export MASTER=local[4]
+python keras/keras_dataset.py
```

Using numpy.ndarray as data input
```bash
-python keras_ndarray.py
+export MASTER=local[4]
+python keras/keras_ndarray.py
```
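The added `export MASTER=local[4]` lines set the Spark master for the pip-installed examples, with 4 local worker threads. As a hedged sketch (not from the README; assumes a Unix system where `nproc` from GNU coreutils is available), the thread count can track the machine's core count instead of being hard-coded:

```shell
# local[N] runs Spark locally with N worker threads.
# Size N to the number of available cores rather than a fixed 4.
export MASTER="local[$(nproc)]"
echo "$MASTER"
```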

## Run the KerasModel example with prebuilt package
@@ -41,21 +43,22 @@ Using TFDataset as data input
export ANALYTICS_ZOO_HOME=... # the directory where you extract the downloaded Analytics Zoo zip package
export SPARK_HOME=... # the root directory of Spark

-sh $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] keras/keras_dataset.py
+bash $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] keras/keras_dataset.py
```

Using numpy.ndarray as data input
```bash
export ANALYTICS_ZOO_HOME=... # the directory where you extract the downloaded Analytics Zoo zip package
export SPARK_HOME=... # the root directory of Spark

-sh $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] keras/keras_ndarray.py
+bash $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] keras/keras_ndarray.py
```

## Run the TFEstimator example after pip install

Using TFDataset as data input
```bash
+export MASTER=local[4]
python estimator/estimator_dataset.py
```

@@ -65,17 +68,17 @@ Using FeatureSet as data input
# the directory to the training data, the sub-directory of IMAGE_PATH should be
# different classes each containing the images of that class.
# e.g.
-# IMAGE_PATH=/cat_dog
+# IMAGE_PATH=file:///cat_dog
# NUM_CLASSES=2
# /cat_dog
# /cats
# cat.001.jpg
# /dogs
# dog.001.jpg
-IMAGE_PATH=...
+IMAGE_PATH=... # file://... for local files and hdfs:// for hdfs files
NUM_CLASSES=..


+export MASTER=local[4]
python estimator/estimator_inception.py --image-path $IMAGE_PATH --num-classes $NUM_CLASSES
```
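The comments above describe the directory layout that `IMAGE_PATH` must point at: one sub-directory per class. A minimal sketch that materializes that layout for a local smoke test (the file names and the two-class cats/dogs split are illustrative, mirroring the example in the comments):

```shell
# Build the documented layout: IMAGE_PATH/<class>/<image files>.
mkdir -p cat_dog/cats cat_dog/dogs
touch cat_dog/cats/cat.001.jpg cat_dog/dogs/dog.001.jpg

# file:// scheme for local files, per the README comment.
export IMAGE_PATH="file://$PWD/cat_dog"
export NUM_CLASSES=2
```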

@@ -86,7 +89,7 @@ Using TFDataset as data input
export ANALYTICS_ZOO_HOME=... # the directory where you extract the downloaded Analytics Zoo zip package
export SPARK_HOME=... # the root directory of Spark

-sh $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] estimator/estimator_dataset.py
+bash $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] estimator/estimator_dataset.py
```

Using FeatureSet as data input
@@ -99,18 +102,18 @@ export SPARK_HOME=... # the root directory of Spark
# the directory to the training data, the sub-directory of IMAGE_PATH should be
# different classes each containing the images of that class.
# e.g.
-# IMAGE_PATH=/cat_dog
+# IMAGE_PATH=file:///cat_dog
# NUM_CLASSES=2
# /cat_dog
# /cats
# cat.001.jpg
# /dogs
# dog.001.jpg
-IMAGE_PATH=...
+IMAGE_PATH=... # file://... for local files and hdfs:// for hdfs files
NUM_CLASSES=..


-sh $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] estimator/estimator_inception.py --image-path $IMAGE_PATH --num-classes $NUM_CLASSES
+bash $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] estimator/estimator_inception.py --image-path $IMAGE_PATH --num-classes $NUM_CLASSES
```
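Since the documented layout keeps one sub-directory per class, `NUM_CLASSES` can be derived instead of typed by hand. A hedged sketch (not from the README) assuming a local copy of the dataset at a hypothetical path:

```shell
IMAGE_DIR=/tmp/cat_dog   # hypothetical local copy of the dataset
# Count only the immediate sub-directories, one per class.
NUM_CLASSES=$(find "$IMAGE_DIR" -mindepth 1 -maxdepth 1 -type d | wc -l)
echo "$NUM_CLASSES"
```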

## Run the Training Example using TFOptimizer after pip install
@@ -131,7 +134,7 @@ python tf_optimizer/train_mnist_keras.py
export ANALYTICS_ZOO_HOME=... # the directory where you extract the downloaded Analytics Zoo zip package
export SPARK_HOME=... # the root directory of Spark

-sh $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] tf_optimizer/train_lenet.py
+bash $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] tf_optimizer/train_lenet.py
```
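The diff consistently replaces `sh` with `bash` for invoking the wrapper script. A plausible reason (an assumption; the PR does not state it): on Debian/Ubuntu systems `/bin/sh` is dash, and scripts that use bash-only features fail under it. A small illustration with a bash-only array:

```shell
# Bash arrays are not POSIX; dash rejects this syntax outright.
snippet='arr=(a b c); echo "${arr[1]}"'
bash -c "$snippet"   # prints: b
```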

For Keras users:
@@ -140,7 +143,7 @@ For Keras users:
export ANALYTICS_ZOO_HOME=... # the directory where you extract the downloaded Analytics Zoo zip package
export SPARK_HOME=... # the root directory of Spark

-sh $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] tf_optimizer/train_mnist_keras.py
+bash $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] tf_optimizer/train_mnist_keras.py
```

## Run the Evaluation Example using TFPredictor after pip install
@@ -161,7 +164,7 @@ python tf_optimizer/evaluate_mnist_keras.py
export ANALYTICS_ZOO_HOME=... # the directory where you extract the downloaded Analytics Zoo zip package
export SPARK_HOME=... # the root directory of Spark

-sh $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] tf_optimizer/evaluate_lenet.py
+bash $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] tf_optimizer/evaluate_lenet.py
```

For Keras users:
@@ -170,5 +173,5 @@ For Keras users:
export ANALYTICS_ZOO_HOME=... # the directory where you extract the downloaded Analytics Zoo zip package
export SPARK_HOME=... # the root directory of Spark

-sh $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] tf_optimizer/evaluate_mnist_keras.py
+bash $ANALYTICS_ZOO_HOME/bin/spark-submit-python-with-zoo.sh --master local[4] tf_optimizer/evaluate_mnist_keras.py
```