1.1.3 Neural Networks and Deep Learning
"Deep" refers only to the number of layers and says nothing about the meaning of the representations learned through this process.
It's important to know some other machine learning techniques:
- Bayesian classification estimates the probability of an event given
  - the a priori belief of how likely the event is
  - the observed facts related to the event
- It is used to classify data given observed facts, which are assumed to be mutually independent
- Kernel methods are classification techniques often used to get a feel for the task at hand. SVMs (support vector machines) are the best-known example; they tackle binary classification problems by mapping the original data into a higher-dimensional space and finding a transformation that maximizes the distance, called the margin, between the two classes.
- Decision trees predict outputs from inputs in a flowchart-like structure; a series of yes/no decisions is followed until the end of the flowchart is reached. They are easy for humans to interpret.
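The Bayesian estimate described above can be sketched numerically. This is a minimal illustration of Bayes' rule with made-up spam-filter numbers (the prior and likelihood values are hypothetical, not from the source):

```javascript
// Bayes' rule: P(event | fact) = P(fact | event) * P(event) / P(fact)
// Hypothetical numbers, purely for illustration.
const prior = 0.2;      // a priori belief: 20% of mail is spam
const likelihood = 0.9; // P(message contains "free" | spam)

// P(message contains "free") overall, via the law of total probability
const evidence = 0.9 * 0.2 + 0.1 * 0.8;

// Posterior: P(spam | message contains "free")
const posterior = (likelihood * prior) / evidence;
console.log(posterior.toFixed(2)); // "0.69"
```

The observed fact ("contains the word free") raises the belief that a message is spam from the 20% prior to roughly 69%.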
Around 2012, the success of deep neural networks on the ImageNet problem made them the go-to solution for image classification. They work well on perceptual tasks generally, for example speech recognition. Deep learning makes problem solving easier by automating the most difficult part of machine learning: feature engineering.
Feature engineering required scientists to transform data to make it more amenable to SVMs and decision trees; deep learning automates this step. Two essential characteristics of how deep learning learns from data are the layered way in which increasingly complex layers of representation are developed, and the fact that these layers are learned jointly, each updated to follow the needs of the layer above and the layer below.
1.1.4 Why Deep Learning Now?
- Datasets and benchmarks: common benchmarks and public competitions have also helped the rise of deep learning.
- Hardware: the matrix multiplications and additions used in deep learning are highly parallelizable and can leverage GPUs; without these it would be impossible to train many DNNs.
- Algorithmic advances: a key issue was gradient propagation through deep layers, where the feedback signal would fade as the number of layers increased. This was solved by
  - better activation functions, such as ReLU
  - better weight initialization, such as Glorot initialization
  - better optimization schemes, such as the RMSProp and ADAM optimizers
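The simplest of these fixes, ReLU, is easy to sketch. The point is that its gradient is exactly 1 for any positive input, so the feedback signal is not attenuated as it passes backward through many layers (unlike sigmoid or tanh, whose gradients are always below 1):

```javascript
// ReLU (rectified linear unit): max(0, x)
const relu = (x) => Math.max(0, x);

// Its derivative: 1 for positive inputs, 0 otherwise,
// so gradients pass through active units undiminished.
const reluGrad = (x) => (x > 0 ? 1 : 0);

console.log(relu(-2.5));  // 0
console.log(relu(3.0));   // 3
console.log(reluGrad(5)); // 1
```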
1.2 Combining Deep Learning and JS
Combining deep learning and the web browser creates unique opportunities. Once a model is trained it must be deployed somewhere to make predictions on real data; without deployment, training a model is wasted effort. This means the browser is a logical place to deploy a machine learning model if the model expects the kind of data available in the browser. Deploying there has several added benefits:
- Reduced server cost: the overhead of downloading a model can be reduced by caching and local storage capabilities.
- Lower inference latency: applications that involve real-time audio demand that the model be deployed client side.
- Privacy: inference on the client side means user data need not be sent to a server.
- Instant WebGL acceleration: the browser's WebGL API can be leveraged to accelerate neural networks. It is also possible to train or fine-tune NNs purely inside the web browser.
- Instant access: browser applications have the advantage of zero install.
SIMD vs. SISD
A GPU's SIMD approach, processing large amounts of data in parallel, will outperform a CPU's SISD approach even though the CPU might process an individual item faster.
1.2.1 Deep Learning in Node.js
The tfjs-node library binds to native TensorFlow code compiled from C++ and CUDA. This makes JavaScript a suitable language for machine learning even when training large models.
Node.js and TensorFlow together shine in two areas: performance and compatibility with existing skill sets. When the model is small enough that language-interpreter performance is a factor, Node.js's JIT-compiling V8 engine is faster than Python.
1.3 Why Tensorflow.js
You have to pick something.
1.3.1 Tensorflow, Keras, and Tensorflow.js
The name reflects what the library does: data representations called tensors flow through various layers in order to solve a task.
Tensors are just multidimensional arrays. In DNNs every piece of data and every result is represented as a tensor. A grayscale image is a 2D array of numbers, while a color image is a 3D array of numbers (the extra dimension holds the color channels). The section on using lists in functional abstraction might have some relation here. Tensors have two properties: data type (dtype) and shape. Shape describes a tensor's size along each of its dimensions.
The "flow" part stems from the fact that tensors pass through a graph: a network of connected mathematical operations called nodes. A node can comprise several layers; it takes tensors as inputs and produces tensors as outputs. A tensor can change both shape and value as it flows through the graph.
Most deep learning gets by with a small set of layer types. In TensorFlow these layer types are provided by the Keras API. Keras also lets a developer configure how the model/network will be trained, feed data to the model for training and inference, monitor training, save and load models, and plot the architecture of models. A full deep learning workflow can be created in a few lines of code using this API. The power and usability of Keras is available in Tensorflow.js, which can also import models from the Python implementations of Keras and TensorFlow.
The lowest level of the Tensorflow.js API is responsible for fast parallel processing. Above that is the Ops API, which mimics TensorFlow's low-level API, and above that is the Layers API, which is what developers would mostly use.
1.3.2 Tensorflow.js Comparison to Other Libraries
- It is the only library to cover the complete deep learning workflow:
- Supports inference & training
- Supports web browsers and Node.js
- Supports GPU acceleration (WebGL & CUDA)
- Supports defining neural net model architectures
- Supports conversion of models to and from the Python libraries
- Supports python API
- Supports data ingestion and data visualization
Community support is also a huge advantage of working with Tensorflow.js
1.3.3 How Tensorflow is Being Used
Music, out-of-the-box transfer learning, translation.
1.3.4 Summary
The central problem of transforming data representations into ones more amenable to solving the task is the core of what Tensorflow.js enables.