
Posts

Showing posts with the label Ollama

How To Run Hugging Face Models On Local CPU Using Ollama

Are you fascinated by the capabilities of Hugging Face models but unsure how to run them locally? Look no further! Here, we will explore the simplest and most effective way to get Hugging Face models up and running on your local machine using Ollama. For a complete walkthrough, check out my latest video, "How to Run Hugging Face Models Locally Using Ollama". It covers everything from installation to running an example, so you have all the information you need to get started. Happy coding!
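To give a quick feel for the workflow covered in the video, here is a minimal sketch using the Ollama Python client. It assumes Ollama is installed and running locally; the GGUF repository path is only an illustrative placeholder, and any public GGUF model from the Hugging Face Hub can be substituted.

# Minimal sketch: pull a GGUF model from the Hugging Face Hub through Ollama
# and chat with it on the local CPU.
import ollama

# Example placeholder (assumption): swap in any public GGUF repository.
model = "hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF"

ollama.pull(model)  # downloads the model into the local Ollama store

response = ollama.chat(
    model=model,
    messages=[{"role": "user", "content": "Summarize what Ollama does in one sentence."}],
)
print(response["message"]["content"])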

Generating AI Model Responses in JSON Format Using Ollama and Llama 3.2

In the rapidly evolving field of artificial intelligence, generating accurate and contextually relevant responses is crucial. Ollama, a lightweight and extensible framework, combined with the powerful Llama 3.2 model, provides a robust solution for generating AI model responses in JSON format. This article explores how to leverage these tools to create efficient and effective AI responses. If you are interested in every single detail, here is my video recording:

Setting Up Ollama and Llama 3.2

Before diving into the specifics of generating responses, it's essential to set up Ollama and Llama 3.2 on your local machine. Ollama offers a straightforward installation process, and you can download the necessary models from the Ollama library.

Import required packages

To get started with the code, first we need to import the required packages:

from ollama import chat
from pydantic import BaseModel

Generating Responses in JSON Format

JSON format is a structure...
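To show where those imports lead, here is a minimal sketch of structured JSON output with the Ollama Python client. It assumes a recent Ollama version with structured-output support and that the llama3.2 model has already been pulled; the Country schema and the prompt are purely illustrative and not part of the original post.

from ollama import chat
from pydantic import BaseModel

# Illustrative schema (assumption): the model's reply must follow this structure.
class Country(BaseModel):
    name: str
    capital: str
    languages: list[str]

# Pass the JSON schema via the format parameter to constrain the output.
response = chat(
    model="llama3.2",
    messages=[{"role": "user", "content": "Tell me about Canada."}],
    format=Country.model_json_schema(),
)

# Validate and parse the JSON string returned by the model.
country = Country.model_validate_json(response.message.content)
print(country)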

Run Your OpenAI SWARM Agents Locally With Open Source Model - 100% 🆓

In this article, we will see how we can run our agents locally. In other words, we will use the OpenAI Swarm framework but still pay nothing to OpenAI, because we will not be using OpenAI's API key. Using OpenAI's Swarm without OpenAI's key, confused? Well, we will achieve this using Ollama :) Before we proceed, if you have not watched my earlier video on what OpenAI-Swarm is and how to get started with it, I would recommend checking it out here: Here is the link to the GitHub repository containing Swarm's source code and implementation details.

What Are We Trying To Do?

We will create our own agent using the OpenAI-Swarm framework, Ollama, and the open-source model Llama3.2:1b. This agent will run locally on our machine without the need for any API key from OpenAI. A minimal sketch of this setup is shown after the excerpt below.

Setting Up The Things

Install Swarm

We need to install Swarm from GitHub, as it is still in the experimental stage, which can be done by running the below command:

pip...
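As a preview of the setup the post walks through, here is a minimal sketch. It assumes Swarm has been installed from its GitHub repository and that Ollama is serving llama3.2:1b locally; the agent name, instructions, and prompt are illustrative placeholders.

from openai import OpenAI
from swarm import Swarm, Agent

# Point the OpenAI client at Ollama's local OpenAI-compatible endpoint.
# Ollama ignores the API key value, but the client requires a non-empty string.
ollama_client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# Hand the local client to Swarm so no OpenAI key is ever used.
client = Swarm(client=ollama_client)

agent = Agent(
    name="Local Assistant",          # illustrative name
    model="llama3.2:1b",             # open-source model served by Ollama
    instructions="You are a helpful, concise assistant.",
)

response = client.run(
    agent=agent,
    messages=[{"role": "user", "content": "Say hello from a fully local agent."}],
)
print(response.messages[-1]["content"])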