Showing posts from August, 2017

Machine Learning Optimizes FPGA Timing

By Bernard Murphy

Machine learning (ML) is the hot new technology of our time, so EDA development teams are eagerly searching for ways to use ML to optimize various facets of design by distilling wisdom from the mountains of data generated in previous designs. Pre-ML, we had little interest in historical data; we would mostly make localized comparisons with recent runs to decide what we felt were best-case implementations. Now, prompted by demonstrated ML value in other domains, we are starting to look for hidden intelligence in a broader range of data.


One such direction uses machine-learning methods to find a path to optimization. Plunify does this with its InTime optimizer for FPGA design. The tool operates as a plugin to a variety of standard FPGA design tools but does the clever part in the cloud (private or public, at your choice), where the goal is to provide optimized strategies for synthesis and place-and-route.

There is a very limited way to do this toda…
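In spirit (a hypothetical sketch, not Plunify's actual algorithm), learning build strategies from past runs can be framed as predicting a timing metric from tool settings and ranking candidate strategies by that prediction. All setting names, values, and slack numbers below are invented for illustration:

```python
# Hypothetical sketch: rank candidate FPGA build strategies by predicting
# worst negative slack (WNS, in ns; higher is better) from the settings
# of previously completed builds. All data here is invented.

def distance(a, b):
    """Euclidean distance between two settings vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def predict_wns(candidate, history, k=2):
    """Predict WNS for a candidate strategy as the mean WNS of its
    k nearest neighbours among previously run strategies."""
    nearest = sorted(history, key=lambda run: distance(candidate, run[0]))[:k]
    return sum(wns for _, wns in nearest) / k

# Each history entry: ((placement_effort, routing_effort, seed), achieved WNS).
history = [
    ((1.0, 1.0, 3.0), -0.42),
    ((2.0, 1.0, 7.0), -0.15),
    ((2.0, 2.0, 7.0), 0.05),
    ((3.0, 2.0, 1.0), 0.02),
]

# Candidate strategies we have not yet run; pick the most promising one.
candidates = [(1.0, 2.0, 5.0), (3.0, 3.0, 2.0), (3.0, 2.0, 2.0)]
best = max(candidates, key=lambda c: predict_wns(c, history))
print(best)
```

A real optimizer would use a far richer model and far more build data, but the loop is the same: learn from completed runs, predict which untried settings look best, run those, and fold the results back into the history.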

SETI's FPGAs

FPGAs and GPUs: a Tour of SETI's Computer Hardware


David MacMahon is a research astronomer with the Berkeley SETI Research Center. Dave works on several projects at BSRC, including Breakthrough Listen, designing many of the computer systems we use to process data collected from our telescopes. If you've ever been curious about the hardware required to search for ET, check out this behind-the-scenes tour of Berkeley SETI.



Slight Street Sign Modifications Can Completely Fool Machine Learning Algorithms

Machine Learning is a hot R&D topic.
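The vulnerability behind this post's title can be illustrated with a toy classifier. The sketch below uses the general idea of the fast gradient sign method (Goodfellow et al.): nudging each input feature slightly in the direction of the loss gradient flips the predicted class even though the change is small. The weights and inputs are invented, and a real street-sign attack perturbs image pixels rather than two numbers:

```python
# Toy adversarial-example sketch: a fixed logistic classifier and an
# input it labels as class 1. A small, deliberately chosen nudge to
# the input flips the prediction. All numbers are invented.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b = [4.0, 4.0], -4.0   # fixed toy classifier
x = [0.5, 0.6]            # original input, scored as class 1
score = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# FGSM-style step: move each feature by eps against the class-1
# direction, i.e. along the sign of the loss gradient for label 1.
eps = 0.1
x_adv = [xi - eps * (1 if wi > 0 else -1) for wi, xi in zip(w, x)]
score_adv = sigmoid(sum(wi * xi for wi, xi in zip(w, x_adv)) + b)

print(score > 0.5, score_adv > 0.5)
```

The same logic applied to a sign-recognition network is why a few well-placed stickers can change what the model "sees" without changing what a human sees.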

One of the strategies used in Machine Learning is to learn by means of neural networks. You can get a free introduction to neural networks here.
I also warmly recommend Andrew Ng's introductory course to Machine Learning on Coursera.

Machine Learning neural networks were inspired by biological neural networks; they are simple to apply yet highly effective in image-processing tasks such as handwritten text recognition.

More complex architectures are used in what is called deep learning, which relies on neural networks with many layers.

Typically a neural network is trained, i.e. it learns, through exposure to thousands of 'good' and 'bad' examples of the image to be recognized or classified. For example, a neural network that has to recognize handwritten numbers will be exposed to thousands of examples of digits written by different people, and even with changes in the orientation of the te…
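The train-on-labelled-examples idea can be sketched with the smallest possible "network": a single logistic neuron. The toy 2-D points below stand in for pixel features of images, and all values are invented for illustration; a real digit recognizer would use many neurons, many layers, and thousands of real images:

```python
# Minimal supervised-training sketch: a single artificial neuron
# (logistic unit) learns to separate 'good' (label 1) from 'bad'
# (label 0) examples via gradient descent. Toy data, invented values.
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy labelled examples: (features, label).
data = [((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.9, 0.8), 1), ((0.8, 0.9), 1)]

random.seed(0)
w = [random.uniform(-0.5, 0.5) for _ in range(2)]
b = 0.0
lr = 1.0  # learning rate

# Training loop: repeatedly adjust weights to reduce prediction error.
for _ in range(2000):
    for (x1, x2), y in data:
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - y                # gradient of the cross-entropy loss
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

# After training, the neuron scores unseen inputs between 0 and 1.
print(round(sigmoid(w[0] * 0.15 + w[1] * 0.15 + b), 2))  # near 0: 'bad'
print(round(sigmoid(w[0] * 0.85 + w[1] * 0.85 + b), 2))  # near 1: 'good'
```

Scaling this up, a digit recognizer replaces the two features with hundreds of pixel values and the single neuron with stacked layers, but the training loop is conceptually the same.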