Facebook’s fastText library is now optimized for mobile devices

  •   Ricky
  •   July 31, 2017
Facebook’s AI research lab has updated fastText, more than tripling the number of languages the library supports and shrinking both model size and memory demand.

In case you are unaware of fastText, it is a fast, open-source text classification library that makes it easy for developers to build tools that make use of language analysis. This comes in handy wherever the language of a piece of content needs to be understood and analyzed. For instance, a tool that recognizes and thwarts clickbait headlines, or one that filters spam, requires some understanding of the language it is reading.
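Under the hood, fastText represents each word by its character n-grams as well as the word itself, hashing everything into a shared feature table so that even unseen words get useful features. As a rough illustration, here is a minimal pure-Python sketch of that n-gram step; the function names and the bucket count are illustrative choices, not fastText's actual internals:

```python
def char_ngrams(word, n_min=3, n_max=6):
    """Extract character n-grams fastText-style: pad the word with
    boundary markers '<' and '>', then slide windows of every length
    from n_min to n_max across the padded word."""
    padded = f"<{word}>"
    grams = []
    for n in range(n_min, n_max + 1):
        for i in range(len(padded) - n + 1):
            grams.append(padded[i:i + n])
    return grams


def hash_features(grams, buckets=2_000_000):
    """Map each n-gram to a bucket index. Sharing one fixed-size hashed
    table keeps memory bounded no matter how large the vocabulary grows.
    (The bucket count here is illustrative, not fastText's default.)"""
    return [hash(g) % buckets for g in grams]


grams = char_ngrams("spam")
print(grams[:4])  # the length-3 windows: ['<sp', 'spa', 'pam', 'am>']
print(hash_features(grams))
```

Because "spam" and "spams" share most of their n-grams, their feature sets overlap heavily, which is what lets the classifier generalize to rare or misspelled words.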

The library was first released with support for 90 languages, including pre-trained word vectors. The number has since gone up, and today’s update takes it to 294. However, while the team behind the library wanted it to run on a wide variety of hardware, a requirement of a few gigabytes of memory meant it could not run on mobile devices.

That is set to change too. Facebook has now cut the memory requirement to just a few hundred kilobytes. The team got there by optimizing how large collections of word vectors are stored and compared, sharply reducing the library’s memory demands.
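The announcement does not spell out the compression scheme, but a standard way to shrink a large table of word vectors is product quantization: split each vector into a few sub-blocks and store, for each block, only the index of its nearest centroid from a small learned codebook. A toy sketch of the idea, with illustrative dimensions and centroid counts (not fastText's actual settings):

```python
import random


def kmeans(blocks, k, iters=10):
    """Tiny k-means over sub-vectors; returns (centroids, assignments)."""
    cents = random.sample(blocks, k)
    assign = [0] * len(blocks)
    for _ in range(iters):
        # Assign each block to its nearest centroid (squared distance).
        assign = [min(range(k),
                      key=lambda c: sum((x - y) ** 2
                                        for x, y in zip(b, cents[c])))
                  for b in blocks]
        # Move each centroid to the mean of its members.
        for c in range(k):
            members = [blocks[i] for i, a in enumerate(assign) if a == c]
            if members:
                cents[c] = [sum(col) / len(members) for col in zip(*members)]
    return cents, assign


def product_quantize(vectors, n_sub=4, k=16):
    """Toy product quantization: cut each vector into n_sub blocks and
    replace every block with a single centroid index. A 32-float vector
    (128 bytes as float32) becomes 4 one-byte codes plus shared codebooks."""
    sub = len(vectors[0]) // n_sub
    codebooks, codes = [], [[] for _ in vectors]
    for b in range(n_sub):
        blocks = [v[b * sub:(b + 1) * sub] for v in vectors]
        cents, assign = kmeans(blocks, k)
        codebooks.append(cents)
        for i, a in enumerate(assign):
            codes[i].append(a)
    return codebooks, codes


random.seed(0)
vecs = [[random.random() for _ in range(32)] for _ in range(200)]
codebooks, codes = product_quantize(vecs)
print(len(codes[0]))  # 4 codes per vector instead of 32 floats
```

The memory win comes from sharing: the codebooks are paid for once, while every vector in the table shrinks from dozens of floats to a handful of small integer codes.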
