I nearly had a minor disaster on my hands a few weeks ago...
I was traveling to Montenegro for the first time and needed some guidance.
As always when I visit a new place, I wanted to learn a bit about the country's cultural norms, infrastructure, and history.
Whenever possible, I try to use so-called artificial intelligence ("AI") to help me out with that kind of stuff.
I didn't really think I needed to ask about the airport. I had already been informed that Tivat Airport had just one terminal.
Out of curiosity, though, I decided to ask the AI tool ChatGPT how many terminals there were at Tivat. It told me two.
This made no sense. I had it on good authority there was only one terminal. That's what my usual travel resources told me.
And when I opened a "fresh" instance of ChatGPT – a new thread with no context of my past interactions – it changed its answer to one terminal.
Folks, while artificial intelligence may be smarter than any other technology we've seen... it's still something of a misnomer today.
And if you can't rely on AI for accurate travel information, you have no business trusting it with your money...
For the record... Tivat Airport does, in fact, have two terminals.
So the "primed" ChatGPT, which had answered my other questions about Montenegro, gave me the correct answer.
The problem was the lack of consistency.
The second terminal at Tivat is almost 20 years old. There are a ton of resources on the web that still claim Tivat Airport has one terminal. A lot of those resources haven't been updated for years.
You need to understand that, despite the name, artificial intelligence is not intelligent. True intelligence wouldn't give two different people (or even one person) different answers to the same question.
These models are trying to recognize patterns based on all the information they're trained on. Left to its own devices, AI is more like "guidance" than intelligence.
You need to guide it just as much as it guides you. It's talking out of both sides of its mouth. Funny enough, the word "artificial" doesn't just mean unnatural... it also means insincere.
There are ways to increase your odds of getting good guidance. For example, if you ask a fresh instance of ChatGPT, "Based on the most recent information you have access to, how many terminals are there at Tivat Airport?"... you might be able to get the correct answer of two.
However, you have to be extremely careful with what you're asking and how you're asking it.
And that's why I worry about folks who take it a step further... and turn to AI as their sole source of financial advice.
AI is still far from replacing investment professionals...
And yet, if you search through ChatGPT's website, you'll see dozens of alleged finance and investing GPTs.
"Finance Wizard" claims to be an AI analyst that predicts future stock market prices... which is impossible.
Another one, called "Financial Advisor," claims to offer retirement planning in the U.S. If you ask it how much money you should put in different types of investments, all it does is parrot the "100 minus your age" rule for splitting a portfolio between stocks and bonds.
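To see just how little thinking that rule involves, here's a minimal sketch in Python (the function name is mine, for illustration only). The whole "strategy" fits in a few lines and takes exactly one input:

```python
def rule_of_100(age: int) -> tuple[int, int]:
    """Return (stock %, bond %) under the classic "100 minus your age" rule.

    The stock allocation is simply 100 minus your age, clamped to 0-100;
    whatever is left over goes to bonds.
    """
    stock_pct = max(0, min(100, 100 - age))
    return stock_pct, 100 - stock_pct

# A 40-year-old is told to hold 60% stocks and 40% bonds -- regardless
# of goals, time horizon, risk tolerance, or market conditions.
print(rule_of_100(40))  # (60, 40)
```

Notice that age is the only variable. Nothing about your savings, your goals, or the market ever enters the calculation, which is exactly the problem with a chatbot that parrots it.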
The point is, AI is not going to ask you for context. It's not going to look for more data until it's sure it's giving you the best information possible.
Financial Advisor won't ask you your age... when you'll need the money... or any questions about your risk tolerance. It's certainly not assessing current market data to tell you how fast you should buy into the stock market.
At this point, AI draws patterns based on the information it's trained on... and nothing more.
You have the tools to make AI work better. You can give it more context. And if used right, it can be a valuable addition to your investing toolbox.
However, you should always take its advice with a grain of salt. The best person to look out for your financial interests is you... and AI hasn't changed that.
Wishing you love, joy, and peace,
Joel
August 2, 2024