Disclaimer: AI is changing rapidly and we are still finding novel applications at an alarming rate, so it's worth noting the date on which this was written. Facts and opinions change quickly in this field right now.
Things are progressing. I find myself putting my math and coding into practice in my role at the financial tech firm. Right now I am getting to grips with their AI system, and all I can say is that there is more to implementing machine learning than understanding the theory and coding it. Seemingly boring processes like data cleaning, storage, and processing can make or break the system, not to mention the documentation. Luckily for me, I enjoy these things, but it's easy to see how someone interested only in the academic side of machine learning might not build a good overall system.
This transition has led to many conversations about what I am doing and where I am going, especially with my clinical colleagues. I find these conversations fascinating, as you get to the bone of what they think and feel about artificial intelligence and machine learning. One thing I came across was the "AI effect". It mirrors the relationship between science and religion. Before the scientific method, god was everywhere and was the reason behind everything. When Halley used the scientific method to predict a comet's position in the sky at a particular time, he showed the world could be understood and predicted. Before that, comets and stars were signs sent from the gods, usually preceding some event; in fact, the word disaster literally means "bad star". We've come a long way since Halley, and orthodox religion keeps moving the goalposts back. Religious scripture once told us how the world worked, what our moral codes were, and what we should do. Nowadays most religious people take their medical advice from scientific findings. Religion in the west retreated to moral bearings: what is right and what is wrong. Although the bible and other religious texts made it very clear that homosexuality is a sin, science, reason, and logic have concluded that homosexuals are just like everyone else and deserve equal rights. Now religious moderates have pushed religion back even further, to imaginative interpretations, with increasing emphasis on finding yourself, improving your life with the love of god, and other vague statements. The last line of defense is to start asking questions about the person asking the questions: it's a fault with them that they cannot understand vague statements and moving goalposts. If you love your data, take the derivative of what religion claims to give guidance on with respect to time.
With this negative rate of change, it’s not crazy to conclude that religion is dying in the west.
What the scientific method did to religion, machine learning and artificial intelligence are doing to the concept of intelligence. The AI effect is observed whenever AI makes a big breakthrough. As with the role of religion, the definition of intelligence keeps getting pushed back. When AI dominated chess 20 years ago, people said it was brute-force algorithms, not real intelligence. Then we had the advancements of neural networks, inspired by the architecture of neurons in the brain: each node weights its inputs, sums them, and if the total is high enough it passes a signal on to the next node. This enabled the algorithm to process multiple binary inputs, find patterns, and weight them. There is little debate now that machine learning algorithms are good at identifying patterns. A lot of higher education, broken down, is asking the right questions so you can find patterns effectively, then using those patterns to improve an outcome. Pattern recognition is widely taught in medical schools: the standard interpretation of an ECG is pattern recognition, and very few clinicians look at the ECG lines as vectors in time and space. They look at patterns with respect to the presenting complaint and past medical history. We are now starting to say that pattern recognition is not true intelligence; it's all about understanding why these patterns occur. Here we can already see the scope of what counts as intelligence shrinking. The vast majority of people see patterns, isolate the variables, and change them to favor the outcome. People would like to think that they have a deeper understanding, but it's not as common as people think. For instance, I remember talking with an anesthetics doctor in the ED. We were talking about aortic aneurysms. I had recently studied fluid dynamics for my physics undergrad, and I excitedly told him that the same mathematics explains why planes stay up in the air: due to the flow and the diameter, the aneurysm would be at higher pressure.
He looked at me blankly, got a little shirty, and told me it wasn't true. But he couldn't have a conversation with me about it. All he could tell me was that aneurysms were bad and needed to be repaired, that people who have them have a higher chance of dying, and he could reel off the procedures and the variables he had to tweak to improve the outcome. He could give me shallow explanations for why he did these things, but he certainly didn't understand them in any depth. He knew enough to weigh up multiple variables and tweak them to improve the outcome. It's not that he was stupid; I'm sure that if he spent the time learning the extra theory he would understand it. It's that he could do his job well without that knowledge. I've heard multiple doctors rush to refute this conclusion by listing the many other variables a doctor has to take into account. The issue is that they are not adding depth; they are merely reeling off more shallow variables to consider. In fact, it's this reliance on many shallow variables that has facilitated such quick advancements in AI in healthcare. To me the doctor was intelligent; it's just that AI is becoming intelligent in itself.
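For the curious, the argument I was making to him can be sketched with the continuity and Bernoulli equations. This is the idealized picture (steady, incompressible, inviscid flow along a streamline); real blood flow is pulsatile and viscous, so treat it as a first approximation rather than a clinical model:

```latex
% Continuity: flow rate is conserved, so the wider (aneurysmal) section
% with cross-sectional area A_2 > A_1 carries slower flow.
A_1 v_1 = A_2 v_2
  \quad\Rightarrow\quad
v_2 = v_1 \frac{A_1}{A_2} < v_1

% Bernoulli along the streamline:
P_1 + \tfrac{1}{2}\rho v_1^2 = P_2 + \tfrac{1}{2}\rho v_2^2

% Slower flow therefore means higher static pressure in the dilated section:
P_2 = P_1 + \tfrac{1}{2}\rho \left( v_1^2 - v_2^2 \right) > P_1
```

The same pressure–velocity trade-off is one part of the standard explanation for lift over an aerofoil, which was the connection I was excited about.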
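As an aside, the neural-network mechanism described earlier, where a node weights its inputs, sums them, and fires only if the total clears a threshold, is small enough to sketch in a few lines of Python. The weights and threshold below are illustrative values I've chosen, not taken from any real model:

```python
def neuron(inputs, weights, threshold):
    """Return 1 if the weighted sum of the inputs clears the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With these weights and threshold, the node behaves like an AND gate
# over two binary inputs: it only fires when both inputs are on.
and_weights = [1.0, 1.0]
print(neuron([1, 1], and_weights, threshold=1.5))  # fires: 1
print(neuron([1, 0], and_weights, threshold=1.5))  # stays quiet: 0
```

Stacking layers of these nodes, and learning the weights from data rather than setting them by hand, is what lets the networks find patterns across many inputs.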
We are still in the initial stages of AI, so naturally a lot of emotion and personal offense will arise from its implementation. The same happened with religion. A small band of independent thinkers and scientists told the rest of the world that they were not special and that a god did not create the world for them. Now it's the computer scientists' turn to tell the world that it is not smart. The acceptance will grow in time. More and more members of western civilization are living with the fact that there isn't an omnipresent god/parent who cares for them and thinks they are special, and that their life has no predefined meaning. In time, more and more people will come to terms with the fact that a computer can be smarter than them. The more I have studied math, physics, and computing, the more I realize that genius is overrated. Yes, we need truly intelligent people; however, the world is in much shorter supply of courage, hard work, and the practical application of knowledge.
I help clinicians get to grips with coding and tech, and I also code for a financial tech firm.