The Raspberry Pi has been at the forefront of single-board computers (SBCs) for quite some time now. However, nearly four years after the launch of the Raspberry Pi 4, a new model is on the horizon.

Previous Raspberry Pi iterations have generally brought faster processors, more RAM, and better I/O, culminating in the Pi 4. However, many Pis are now used for AI (artificial intelligence) and ML (machine learning) workloads, which has led to plenty of speculation among DIY enthusiasts about the Raspberry Pi 5’s built-in machine learning capabilities.

Which CPU will the Raspberry Pi 5 get?

Whether or not the Raspberry Pi 5 has built-in machine learning capabilities depends on which CPU the board is based on. Raspberry Pi co-founder Eben Upton teased the future of custom Pi silicon back at the TinyML Summit 2021. Since then, speculation has grown that an imminent Raspberry Pi 5 release could bring massive improvements to ML.

Up to the Raspberry Pi 4, the company has relied on Arm’s Cortex processors. However, the release of the Raspberry Pi Pico in 2021 brought the RP2040, the company’s first in-house SoC (system-on-chip). While it doesn’t have as much power as the Raspberry Pi Zero 2 W, one of the cheapest SBCs on the market, it offers microcontroller capabilities similar to an Arduino.

The Raspberry Pi 2, Pi 3, and Pi 4 use Arm’s Cortex-A7, Cortex-A53, and Cortex-A72 processors, respectively. Each generation has increased the board’s processing capabilities, giving each successive Pi more ML prowess. So does this mean we’ll see machine learning built into the Raspberry Pi 5’s CPU?

While there’s no official word on what processor will power the Pi 5, you can be sure that it will be the most ML-capable SBC in the Raspberry Pi lineup and will likely have built-in ML support. The company’s application-specific integrated circuit (ASIC) team is working on the next iteration, focused on lightweight accelerators for ultra-low-power ML applications.

Upton’s talk at the TinyML Summit 2021 suggests that this could come in the form of lightweight accelerators capable of running four to eight multiply-accumulate (MAC) operations per clock cycle. The company has also worked with Arducam on the Arducam Pico4ML, which brings together ML, a camera, a microphone, and a screen in a Pico-sized package.
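
To put those numbers in perspective, here is a minimal back-of-the-envelope sketch. The layer sizes and clock speed are illustrative assumptions (an RP2040-class clock is used as a stand-in), not confirmed Raspberry Pi 5 specifications; the point is simply how quickly MAC counts add up for even a tiny neural network layer.

```python
# Back-of-the-envelope sketch: how many multiply-accumulate (MAC) operations
# a small fully connected layer needs, and roughly how long that takes on a
# hypothetical accelerator retiring a handful of MACs per clock cycle.
# All figures below are illustrative assumptions, not Pi 5 specifications.

inputs = 256          # neurons feeding into the layer
outputs = 64          # neurons in the layer
macs_per_inference = inputs * outputs   # one MAC per weight

macs_per_cycle = 8          # upper end of the "four to eight" figure
clock_hz = 133_000_000      # RP2040-class clock speed, used as a stand-in

cycles = macs_per_inference / macs_per_cycle
print(f"MACs per inference: {macs_per_inference}")
print(f"Cycles needed: {cycles:.0f}")
print(f"Time at {clock_hz / 1e6:.0f} MHz: {cycles / clock_hz * 1e6:.1f} µs")
```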

While not all details about the Raspberry Pi 5 have been confirmed yet, if Raspberry Pi sticks to its trend of incrementally upgrading its boards, the upcoming SBC could check a lot of boxes for ML enthusiasts and developers looking for cheap hardware for their ML projects.

Raspberry Pis can be a lot of fun

The Raspberry Pi 5 could come with built-in machine learning support, opening up a plethora of opportunities for anyone looking to build their own ML applications with hardware that’s finally able to keep up with the technology without breaking the bank.

You can already run anything from a large language model (LLM) to a Minecraft server on an existing Raspberry Pi. As SBCs become more capable (and accessible), so will the possibilities for what you can do with a credit-card-sized computer.

Large language models, commonly (and wrongly) known as AI, have been threatening to upend the publishing, art, and legal worlds for months. One downside is that using an LLM like ChatGPT means creating an account and doing the work on someone else’s computer. But you can run a trained LLM on your own Raspberry Pi to write poetry, answer questions, and more.
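
As a rough sketch of what that can look like, the snippet below uses the llama-cpp-python bindings to load a locally stored quantized model and generate a short completion. The model filename is just a placeholder for whichever lightweight GGUF model you have downloaded; small, heavily quantized models are the practical choice within a Pi’s limited RAM.

```python
# Minimal sketch: running a small quantized LLM locally with llama-cpp-python.
# The model path is a placeholder -- substitute whatever lightweight GGUF
# model you have downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="models/tiny-model-q4.gguf",  # hypothetical local file
    n_ctx=512,       # modest context window to keep memory use low
    n_threads=4,     # the Pi 4 has four cores
)

response = llm(
    "Write a short poem about a credit-card-sized computer.",
    max_tokens=128,
    temperature=0.8,
)

print(response["choices"][0]["text"])
```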

What is a large language model?

Large language models use machine learning algorithms to find relationships and patterns between words and phrases. Trained on large amounts of data, they are able to predict which words are statistically likely to appear when prompted.
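
To make that idea concrete, here is a deliberately tiny toy sketch. It is nothing like a real LLM’s neural network; it just counts which word follows which in a made-up corpus, which is enough to show what “predicting the statistically likely next word” means.

```python
# Toy illustration of next-word prediction: count which word follows which
# in a tiny corpus, then predict the most likely continuation. Real LLMs use
# neural networks over tokens, but the statistical idea is the same.
from collections import Counter, defaultdict

corpus = (
    "i am fine today . "
    "i am fine thanks . "
    "i am late for work . "
).split()

# Count how often each word follows each other word (a bigram model).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    candidates = following[word]
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("am"))    # 'fine' -- seen twice, vs. 'late' once
print(predict_next("fine"))  # 'today' (tied with 'thanks', first seen wins)
```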

If you ask thousands of people how they are feeling today, the responses will be along the lines of “I’m fine”, “Could be worse”, or “Fine, but my knee is playing up”. The conversation would then turn in a different direction: perhaps the person will ask about your own health, or follow up with “Sorry, I’ve got to rush. I’m late for work.”

Given this data and an initial prompt, a large language model should be able to come up with a convincing and original answer of its own, based on the probability of a certain word appearing in a sequence, combined with a predetermined degree of randomness (temperature), a repetition penalty, and other parameters.
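
As an illustration of how those parameters interact, here is a small self-contained sketch with made-up word scores (not a real model’s output). It turns raw scores into probabilities, sharpens or flattens them with a temperature, penalises words that have already been used, and then samples the next word.

```python
# Sketch of sampling the next word from made-up scores, showing how
# temperature and a repetition penalty shape the choice. The scores are
# invented for illustration -- a real model produces one per vocabulary token.
import math
import random

def sample_next(scores, history, temperature=0.8, repetition_penalty=1.3):
    """Pick a word: softmax over scores with temperature, penalising repeats."""
    adjusted = {}
    for word, score in scores.items():
        if word in history:            # discourage words we already used
            score /= repetition_penalty
        adjusted[word] = score / temperature   # lower temperature => sharper

    # Softmax: turn adjusted scores into a probability distribution.
    total = sum(math.exp(s) for s in adjusted.values())
    probs = {w: math.exp(s) / total for w, s in adjusted.items()}

    # Sample one word according to those probabilities.
    return random.choices(list(probs), weights=probs.values(), k=1)[0]

scores = {"fine": 2.5, "worse": 1.2, "late": 0.8}
print(sample_next(scores, history=["fine"]))   # "fine" is now less likely
```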

The large language models in use today are not trained on vox pops of a few thousand people. Instead, they are given unimaginable amounts of data scraped from publicly available archives, social media platforms, web pages, and sometimes custom datasets.
