It's taken me a while to come around, but I've become a fan of certain AI tools -- when used for specific purposes. I've even found some of those tools to be very helpful throughout my day (so much so that I haven't used Google's search engine in weeks).
That, my friends, is refreshing.
How I got here was a bit circuitous. I started out 100% against AI, but then I realized my objection was really to AI used as a shortcut for writing and other artistic endeavors. Once I realized AI was very good at helping me research different topics (where I'd previously used a search engine), I adopted it into my process.
Also: How I made Perplexity AI the default search engine in my browser
When I'm working on something and need to understand a concept (such as the weight of NASA's crawler-transporter), I'll use AI. Inevitably, when I get my answer, there are other rabbit holes to follow, and chasing them means I learn something new.
I love to learn.
But what tools am I using, and how do they get worked into my processes? Why don't I just tell you?
Ollama and Msty are the combination I use the most. Ollama is a command-line AI tool that lets me run specific LLMs locally, so I can trust my queries won't wind up on a third-party server, waiting to be used for whatever purpose. The LLM I use most with that combination is Llama 3.2 because it's fast and to the point. Its answers trim all the fat, so I get only what I need.
Also: How I feed my files to a local AI for better, more relevant responses
Msty is a front-end for Ollama, which means I can use a handy GUI instead of the command line. Although the command line is second nature to me, when I'm working on my local desktop, I prefer a GUI. The good news is that Msty is available for Linux, MacOS, and Windows, so it doesn't matter what OS I'm using -- the Ollama/Msty combination is there to help me with my research.
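Because Msty is just a friendly face on Ollama, anything the pair can do from the GUI can also be scripted against Ollama's local REST API. Here's a minimal sketch in Python of the kind of lookup I'm describing -- assuming Ollama is running on its default port (11434) and the llama3.2 model has already been pulled:

    import json
    import urllib.request

    def ask_ollama(prompt, model="llama3.2"):
        """Send a prompt to the local Ollama server and return its answer."""
        # Ollama listens on localhost by default, so the query never leaves my machine.
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # ask for one complete answer rather than a token stream
        }).encode("utf-8")
        request = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as response:
            return json.loads(response.read())["response"]

    print(ask_ollama("How much does NASA's crawler-transporter weigh?"))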
Remember when I said that Llama 3.2 gets right to the point with its answers? Every so often, I need more from my research assistant; I need it to dig deeper and surface even more rabbit holes for me to dive into. I might be writing a book about time travel and need a foundational understanding of how it could work. With the right information, that could lead me down other avenues and toward an even better plot. That's when I turn to DeepSeek R1. For something quick and easy, though, DeepSeek R1 would only leave me frustrated.
Also: How to run DeepSeek AI locally to protect your privacy - 2 easy ways
Why? DeepSeek R1 is really long-winded. It's like your friend telling you a story but adding so much detail that it takes forever to get to the point. When I need lots of detail, I switch LLMs in Msty from Llama 3.2 to DeepSeek R1. I might have to spend considerably more time waiting for the response and combing through it, but when detailed information is required, it's worth every second.
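Because Msty drives the same local models that Ollama manages, the same sort of script can make that swap -- again just a sketch, assuming the deepseek-r1 model has also been pulled and reusing the ask_ollama helper from above:

    # Fast, trimmed-down answer for quick lookups
    print(ask_ollama("Summarize the leading theories of time travel", model="llama3.2"))

    # Slower, far more detailed answer when I want rabbit holes to chase
    print(ask_ollama("Summarize the leading theories of time travel", model="deepseek-r1"))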
Perplexity has become my default search engine in Zen Browser. Why did I do this? First off, I dislike and distrust Google's search engine. There's no way of knowing whether my data is being used against me to build a profile, and Google's algorithm never seems to get it right for me. Besides, when I use a search engine for research, I inevitably wind up on a site that's so overrun by ads it brings my browser to a crawl, one with incorrect or out-of-date information, or one that makes it painfully obvious Google would rather serve me sites that pay the company than give me the answers I need.
Also: What is Perplexity Deep Research, and how do you use it?
Perplexity makes for a much better replacement because I get the answers I need -- without the filler, ads, or poorly coded pages.
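If you want to try the same setup, most browsers that allow custom search engines let you point one at Perplexity with a URL template along these lines (worth double-checking against Perplexity's current search URL, but this is the general idea):

    https://www.perplexity.ai/search?q=%s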
Although I switched to Zen Browser as my default, I still use Opera for certain purposes. For instance, when I'm working on something and know I'll need to keep going back to AI for research, I might open that work in Opera and use Aria as needed. The reason is that Aria is easily accessible from the Opera sidebar: I click the Aria icon, a panel opens where I can run my query, and when I'm done, the panel slides out of the way and I'm back to work. The Opera/Aria integration is the best AI/browser combo on the market.
Although I've installed Perplexity on my Android phone, it cannot be used as the default digital assistant -- that's Gemini's job.
Also: The best AI chatbots of 2025: ChatGPT, Copilot, and notable alternatives
When I'm on the go, if I need to do some quick research, I'll ask Gemini my query or use it to set alarms, add appointments to my calendar, send messages, and more. If you're an Android user, it's hard to avoid using Gemini. I've also used Gemini Live, which is a great way to talk through problems. Usually, I'll turn to my wife or my best friend, but when I need absolutely unbiased information, Gemini Live is a great place to start.