Tag: Local-Ai

3 entries found


Local AI on Raspberry Pi 5 with Ollama: Your private AI server at home

5 min read

A few months ago I came across something that really caught my attention: the possibility of having my own “ChatGPT” running at home, without sending data anywhere, using only a Raspberry Pi 5. Sounds too good to be true, right?

Well, it turns out that with Ollama and a Pi 5 it’s perfectly possible to set up a local AI server that works surprisingly well. Let me tell you my experience and how you can do it too.
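For reference, the basic setup is only a few commands. This is a minimal sketch, not the post's exact steps: it assumes a 64-bit OS on the Pi 5, and the model tag shown (`llama3.2:3b`) is just an example of a small model that fits the board's RAM.

```shell
# Install Ollama via the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Download and chat with a small model (the 3B tag is an example;
# pick any model sized for your Pi's memory)
ollama run llama3.2:3b

# Optionally expose the API on your LAN so other devices can query it
OLLAMA_HOST=0.0.0.0 ollama serve
```

Once the server is running, any machine on the network can talk to it over Ollama's HTTP API on port 11434.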

LM Studio Removes Barriers: Now Free for Work Too

5 min read

In my years developing software, I’ve learned that the best tools are those that eliminate unnecessary friction. And LM Studio has just taken a huge step in that direction: it’s now completely free for enterprise use.

This may sound like “just another AI news item,” but for those of us who have been experimenting with local models for a while, this is an important paradigm shift.

The problem that existed before

Since its launch in May 2023, LM Studio has been free for personal use. But if you wanted to use it at your company, you had to contact the team to obtain a commercial license. That created exactly the kind of friction that kills experimentation on teams.