
SOLVED: None of PyTorch, TensorFlow >= 2.0, or Flax have been found

3 min read · May 16, 2025
TensorFlow — PyTorch — Flax

When working with the Hugging Face transformers library, you may encounter a message like:

“None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won’t be available and only tokenizers, configuration and file/data utilities can be used.”

This message isn’t an error — it’s a warning. However, it highlights an important detail about how the transformers library functions under the hood and what capabilities are currently available in your Python environment.

The transformers library was designed to be backend-agnostic, meaning it can operate using different deep learning frameworks, including PyTorch, TensorFlow (version 2.0 or newer), and Flax. These frameworks provide the computational backbone needed to run and fine-tune pretrained language models like BERT, GPT, T5, and many others.
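If you are unsure which of these backends transformers can actually see in your environment, the library ships small helper functions you can call directly. A minimal sketch (the helpers live in transformers.utils and return plain booleans):

```python
# Minimal sketch: report which deep learning backends the installed
# transformers package detects in the current environment.
from transformers.utils import is_torch_available, is_tf_available, is_flax_available

print("PyTorch available:   ", is_torch_available())
print("TensorFlow available:", is_tf_available())
print("Flax available:      ", is_flax_available())
```

If all three print False, you are in exactly the situation the warning describes.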

If none of these frameworks are installed in your environment, the transformers library will still be partially functional. You will still be able to work with tokenizers, model configuration files, and utility functions for handling files and datasets. However, you won’t be able to actually load and run models — whether for inference or training — because the computational logic depends entirely on one of the aforementioned libraries being present.
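A minimal sketch of what this looks like in practice, assuming an environment without any of the three frameworks ("bert-base-uncased" is just an example checkpoint):

```python
# Tokenizers and configuration objects load fine without a deep learning backend.
from transformers import AutoConfig, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
config = AutoConfig.from_pretrained("bert-base-uncased")

print(tokenizer("Hello world")["input_ids"])  # token IDs, no framework needed
print(config.hidden_size)                     # plain configuration value

# Loading an actual model, however, requires PyTorch, TensorFlow, or Flax;
# without one of them, calling e.g. AutoModel.from_pretrained(...) will
# raise an error telling you which framework is missing.
```

Installing any one of the three frameworks, for example with pip install torch, is enough to make the corresponding model classes available and silence the warning.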



Written by Johan Louwers

Johan Louwers is a technology enthusiast with a long background in supporting enterprises and startups alike as CTO, Chief Enterprise Architect, and developer.
