Practical TensorFlow: 1-Day Intensive
“Practical TensorFlow” is a one- or two-day training event, depending on the audience’s level and requested pace. It focuses on developing machine learning models using Google’s TensorFlow library for machine intelligence.
Teams will learn best practices for building, evaluating and deploying scalable data services using Python while exploring existing software libraries to help them save time and avoid reinventing the wheel.
The course will appeal mainly to data scientists or machine learning engineers with an interest in deep neural models.
All exercises and discussions will be adjusted to reflect the problems currently facing your business.
A combination of teaching and hands-on programming exercises will give learners the opportunity to apply, test and refine their knowledge, improving retention and building confidence through real-time feedback.
Who Should Attend?
The course is designed for machine learning developers, but is suitable for analysts and data scientists with a basic understanding of Python and previous experience with machine learning.
What Teams Will Learn
- Develop machine learning models using Google’s TensorFlow machine intelligence library.
- Understand capabilities and limitations of TensorFlow applied to processing text documents.
- Understand the trade-off of CPU vs. GPU computations.
- Build deep models using word embeddings, convolutions, pooling, dropout, batch normalisation, recurrent models such as LSTMs, GRUs, and attention models.
- Debug TensorFlow models for learning errors and computational efficiency.
Prerequisites and Recommended Background
Attendees are expected to be familiar with basic programming concepts and terminology (command line, shell, filesystem navigation, basic data structures and algorithms such as lists or dictionaries, and basic Python syntax), as well as basic machine learning concepts (training, testing, cross-validation, loss functions).
In addition, participants will be expected to download and install data and software libraries as per our provided “Before You Arrive – Setup Sheet” in advance of the session. Delays due to installation issues on-site may adversely affect the day’s training schedule.
Each participant must have their own laptop, with a system that supports Python (OS X, Linux, Windows…), to participate in the interactive exercises throughout the training.
Please note: this syllabus can be customised to your specific needs, projects or areas of focus. We are happy to tailor course content and exercises to meet your business goals.
- Session 4: Optimisation
- Introduction to graph parameter optimisation
- Automatic differentiation
- Use of optimisers: SGD, Adam, etc.
Learned skills: TensorFlow model parameter optimisation.
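As a taste of the optimisation session, here is a minimal sketch of the SGD update rule in plain Python (no TensorFlow required). The toy quadratic loss, starting point and learning rate are illustrative choices, not course material:

```python
# Minimal sketch of gradient descent on a toy quadratic loss
# L(w) = (w - 3)^2, whose gradient is dL/dw = 2 * (w - 3).
# In TensorFlow, automatic differentiation computes this gradient for you.

def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0      # initial parameter value (illustrative)
lr = 0.1     # learning rate / step size (illustrative)
for _ in range(100):
    w -= lr * grad(w)   # SGD update: w <- w - lr * dL/dw

print(round(w, 4))  # converges towards the minimum at w = 3
```

Optimisers such as Adam refine this same update with per-parameter adaptive step sizes.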
- Session 5: Construction of (very deep) neural networks
- Introduction to TensorFlow neural network capabilities: tf.nn package
- Examples of use: word embeddings, convolutions, pooling, dropout, batch normalisation, (bidirectional) recurrent models (LSTM, GRU), attention models
- Structuring TensorFlow models for readability: tfx, Scikit Flow, tflearn, Keras
Learned skills: constructing complex machine learning models based on deep neural networks.
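Two of the building blocks listed above, convolution and pooling, can be sketched in a few lines of plain Python. Sizes and values here are illustrative; in the course these operations are built with TensorFlow's tf.nn ops:

```python
# Toy 1-D "valid" convolution followed by non-overlapping max pooling,
# illustrating two common deep-network building blocks.

def conv1d(signal, kernel):
    # Slide the kernel over the signal; no padding ("valid" convolution).
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def max_pool1d(xs, size):
    # Keep the maximum of each non-overlapping window of `size` values.
    return [max(xs[i:i + size]) for i in range(0, len(xs) - size + 1, size)]

features = conv1d([1, 2, 3, 4, 5], [1, 0, -1])   # edge-detector-like kernel
pooled = max_pool1d([1, 3, 2, 5, 4, 6], 2)
print(features, pooled)
```

Stacking such layers, together with dropout and normalisation, is what turns them into the deep models covered in this session.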
- Session 6: Construction of optimisation losses
- TensorFlow defined losses: L2, sigmoid and softmax cross entropy, sampled NCE, sampled softmax
- Construction of custom losses: example of a loss for a set of binary classifiers and categorical classifiers
- Efficiency and accuracy of loss functions
Learned skills: knowledge of standard TensorFlow losses, construction of custom loss functions.
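To make the softmax cross-entropy loss concrete, here is a plain-Python sketch of what TensorFlow's tf.nn.softmax_cross_entropy_with_logits computes (TensorFlow does it in a numerically stable, batched form); the logits and label are illustrative:

```python
import math

def softmax(logits):
    m = max(logits)                          # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def softmax_cross_entropy(logits, label):
    # label is the index of the true class; the loss is the negative
    # log-probability the model assigns to that class.
    return -math.log(softmax(logits)[label])

loss = softmax_cross_entropy([2.0, 1.0, 0.1], label=0)
print(round(loss, 4))
```

Custom losses, such as one combining several binary classifiers, are built by composing primitives like these in the computational graph.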
- Session 7: Efficiency and debugging of graphs and the learning process
- GPU vs. CPU computation on one machine with multiple CPUs and GPUs
- Debugging the efficiency of the computational graph: slicing, concatenation, reshaping matrices
- Optimising GPU memory transfers: adding data to the computational graph as constants
- Visualisation of the learning process using TensorFlow
Learned skills: debugging the TensorFlow computational graph, visualisation of the learning process.
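The kind of monitoring covered in this session can be sketched in plain Python: run a toy training loop, record the loss at every step, and inspect the curve, much as one would with TensorFlow summaries. The toy loss and hyperparameters are illustrative assumptions:

```python
# Sketch of monitoring the learning process: record the loss at each
# gradient-descent step on a toy loss L(w) = w^2 (gradient dL/dw = 2w).

def loss(w):
    return w * w

def grad(w):
    return 2.0 * w

w, lr = 4.0, 0.1
history = []
for step in range(10):
    history.append(loss(w))   # log the loss before each update
    w -= lr * grad(w)

# A monotonically decreasing loss curve indicates learning is progressing;
# a flat or rising curve is the first sign of a bug or a bad learning rate.
print(["%.3f" % l for l in history])
```

In the course the same idea is applied with TensorFlow's own visualisation tooling rather than print statements.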