# HOW I GOT KERAS + TENSORFLOW WORKING ON MY MAC OS 10.12.3

Keras ( https://keras.io/ ) is a high-level neural networks API, written in Python and capable of running on top of either TensorFlow or Theano.

TensorFlow ( https://www.tensorflow.org/ ) is the most popular Python framework for building and deploying deep neural networks.

In this notebook I go over the steps I took to install Keras and TensorFlow on Mac OS 10.12.3. This was written March 24, 2017. If you follow these steps, chances are good you'll have a working environment to experiment with Deep Learning. Let's start with the Anaconda installation.

### 1. Install Anaconda with Python 2.7.X.

Anaconda ( https://www.continuum.io/downloads ) is a distribution of Python that includes many of the most popular Python, R and Scala packages for data science. Installation instructions are at the provided link. NOTE: I used the graphical installer and ended up with an installation of Python in ~/anaconda.

### 2. Install virtualenv using pip install.

"A Virtual Environment is a tool to keep the dependencies required by different projects in separate places, by creating virtual Python environments for them. It solves the “Project X depends on version 1.x but, Project Y needs 4.x” dilemma, and keeps your global site-packages directory clean and manageable." - Kenneth Reitz

I used a virtual environment to install tensorflow. It's probably a good idea for you, too. Reasonable instructions for installation (as well as use) of virtualenv and virtualenvwrapper can be found at the following link: http://docs.python-guide.org/en/latest/dev/virtualenvs/

pip install virtualenv

#### 2b (optional). Install virtualenvwrapper.

NOTE: Some people tell you to install virtualenvwrapper to make it "easier" to use virtualenv. I found it not to be easier, but harder. I suggest staying with virtualenv until you have a reason to "upgrade" to virtualenvwrapper.

Note: The instructions at the link above (in section 2) have the wrong location for virtualenvwrapper.sh if you installed on Mac OS. You can find where this file is located by doing this on the command line:

find / -name virtualenvwrapper.sh

The file turned up here on my system: /Users/bryan/anaconda/bin/virtualenvwrapper.sh

### 3. Downgrade to Python 2.7.9 to avoid the conda/virtualenv conflict present in Python 2.7.10 and higher

There is a conflict between conda and virtualenv in Anaconda installs that use Python 2 versions above 2.7.9. It manifests as a pip wheel error message if you try to make a virtual environment using virtualenvwrapper, and as a different error if you use virtualenv directly. The good news is that it's easy to change the version of Python in your Anaconda install. Here's how I did it:

conda search python

This gave me a list of available versions. I then did this:

conda install python=2.7.9

This installed the version that gave no conflict between pip install and virtualenv/virtualenvwrapper.

Here's a link that gives more details on the process of specifying a python version: https://conda.io/docs/py2or3.html
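Once the downgrade finishes, a quick sanity check is to ask Python itself which version is running; a minimal snippet:

```python
# Quick sanity check: print the interpreter version Python reports.
# After the downgrade above, this should show 2.7.9.
import sys

version = tuple(sys.version_info[:3])  # e.g. (2, 7, 9)
print("Running Python %d.%d.%d" % version)
```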

### 4. Create a virtual environment for tensorflow with virtualenvwrapper

Type this at the command line:

mkvirtualenv tensorflow

This made a copy of my Python installation in the directory ~/tensorflow.

Once this is done, and you get into the virtual environment (see the next step), everything you install will go into this ~/tensorflow directory. As long as you are in the virtual environment, this is where Python will look for all its tools.
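A quick way to see which installation a given python command is actually using is to ask the interpreter for its own location; a small sketch:

```python
# Print where the active interpreter lives. Inside the virtual
# environment, sys.prefix should point at the environment's
# directory (e.g. ~/tensorflow) rather than the base Anaconda install.
import sys

print("Interpreter: %s" % sys.executable)
print("Environment prefix: %s" % sys.prefix)
```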

### 5. Get into the virtual environment

Type this at the command line:

source activate tensorflow

You can tell you are in the virtual environment because your command-line prompt changes. Before typing "source activate tensorflow", my prompt looked like this:

Bryans-MacBook-Pro:

After typing, it looks like this:

(tensorflow) Bryans-MacBook-Pro:

To get out of this virtual environment again, just type this (don't do it yet; I'm just telling you how for when you're all done installing and testing everything):

source deactivate
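Besides watching the prompt, you can check from inside Python whether an environment is active by looking at the environment variables; a heuristic sketch (VIRTUAL_ENV is set by virtualenv's own activate script, while conda's "source activate" sets conda-specific variables instead, so treat this as a hint rather than a definitive check):

```python
# virtualenv-style activation scripts export VIRTUAL_ENV; it is
# unset when no such environment is active. Note that conda's
# activate uses different variables, so this is only a heuristic.
import os

active = os.environ.get('VIRTUAL_ENV')
if active:
    print("Active virtual environment: %s" % active)
else:
    print("No virtualenv-style environment is active.")
```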

### 6. Install TensorFlow in this virtual environment

Now that you're in the virtual environment, install TensorFlow:

pip install tensorflow

### 7. Install Keras in this virtual environment

pip install keras

### 8. Install IPython in the virtual environment

Yes, I know: you already have IPython and Jupyter installed with Anaconda. But they won't work with TensorFlow and Keras unless you reinstall them in the virtual environment you've created.
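If you want to verify that an import is coming from the virtual environment rather than your base install, you can inspect the module's __file__ attribute; a minimal sketch using a standard-library module as a stand-in (the same check works for tensorflow or keras once they are installed):

```python
# Inspect where a module was imported from. Inside the virtual
# environment, packages you pip-installed there should resolve to
# paths under the environment's directory, not the base install.
import json

print("json was imported from: %s" % json.__file__)
```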

pip install ipython

### 9. Install Jupyter in the virtual environment

pip install jupyter

### 10. Start up jupyter notebook and open up a notebook that lets you test your installation.

jupyter notebook

This is the point where you'd grab the IPython notebook version of this page and open it.

### 11. Verify your TensorFlow installation.

Verify that your TensorFlow install worked by running the code block below. If you see "Hello, TensorFlow!" when you run it, you should be OK.

In [1]:
import tensorflow as tf
hello = tf.constant('Hello, TensorFlow!')
sess = tf.Session()
print(sess.run(hello))

Hello, TensorFlow!


### 12. Now verify that Keras installed OK by making a little neural net and training it.

The code block below makes a little network, some dummy data and then trains the network for a few epochs. If you see something like the following output after running the code block below, you're in good shape:

Epoch 1/10 1000/1000 [==============================] - 0s - loss: 0.7134 - acc: 0.4840

Epoch 2/10 1000/1000 [==============================] - 0s - loss: 0.6995 - acc: 0.5130

(etc....I'm not going to show you the whole list of epoch outputs)

In [1]:
# Make a single-input model with 2 classes (binary classification):

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(32, activation='relu', input_dim=100))  # hidden layer
model.add(Dense(1, activation='sigmoid'))               # output layer
model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy'])

# Generate dummy data
import numpy as np
data = np.random.random((1000, 100))
labels = np.random.randint(2, size=(1000, 1))

# Train the model, iterating on the data in batches of 32 samples
model.fit(data, labels, epochs=10, batch_size=32)

Using TensorFlow backend.

Epoch 1/10
1000/1000 [==============================] - 0s - loss: 0.7116 - acc: 0.5100     
Epoch 2/10
1000/1000 [==============================] - 0s - loss: 0.6934 - acc: 0.5430     
Epoch 3/10
1000/1000 [==============================] - 0s - loss: 0.6862 - acc: 0.5490     
Epoch 4/10
1000/1000 [==============================] - 0s - loss: 0.6837 - acc: 0.5510     
Epoch 5/10
1000/1000 [==============================] - 0s - loss: 0.6787 - acc: 0.5710     
Epoch 6/10
1000/1000 [==============================] - 0s - loss: 0.6759 - acc: 0.5830     
Epoch 7/10
1000/1000 [==============================] - 0s - loss: 0.6735 - acc: 0.5770     
Epoch 8/10
1000/1000 [==============================] - 0s - loss: 0.6705 - acc: 0.5750     
Epoch 9/10
1000/1000 [==============================] - 0s - loss: 0.6692 - acc: 0.5890     
Epoch 10/10
1000/1000 [==============================] - 0s - loss: 0.6642 - acc: 0.6000     

Out[1]:
<keras.callbacks.History at 0x107625bd0>