Is it Time to Lean into AI?

As developers, we are constantly exposed to new tools and technologies, but the most recent wave of AI advancements feels different. In this post, I'm going to share my thoughts on whether or not the hype around generative AI is real.

Aaron Bos | Sunday, April 16, 2023


Artificial intelligence has been around for a long time both in theory and in practicality. Alan Turing developed the Turing Test as early as 1950 and since then there have been gradual advancements in the space of artificial intelligence and machine learning. As AI technology has evolved we've found ways to integrate it into our lives through games like chess, search engines, recommendation systems, autonomous vehicles, and much more. So if all of this technology has been in place for so long what makes the recent advancements in generative AI and around OpenAI's ChatGPT different? Is there any substance behind all of the hype that it has received or is the hype not real? Honestly, I don't have answers to these questions, but I do have some thoughts that I'd like to get out of my head.

The recent advancements in generative AI, that is artificial intelligence capable of creating content, feel different to me. For the first time that I can remember we can ask machines to create content like code, stories, images, videos, music, etc. from nothing. At least it seems like it is creating it from nothing. It's my understanding that the underlying language models have been trained on so much data from the web that they're able to translate just about any request into a response that feels unique.

In the past year, we've seen the introduction of ChatGPT and GitHub Copilot, which both seem to be revolutionary tools in their own right. OpenAI's ChatGPT is built with massive language models that can provide "human-like" responses to text prompts. The free version of ChatGPT is using GPT-3.5 which is impressive but is limited in the number of tokens it can store which limits its effectiveness. The latest release of ChatGPT with GPT-4 appears to improve upon the shortcomings of its predecessor, which makes it potentially more useful. I haven't had a chance to experiment with ChatGPT using GPT-4 yet since it is behind their subscription, but from a software engineer's perspective I think this video by Nick Chapsas does a great job of demonstrating its power.

With each release and advancement in this space, the question that keeps coming up for knowledge workers (like software engineers) is "Should we be worried about our jobs?". I think that everyone has their own opinion on this, but my take is that the time will eventually come when software engineers, at least as we view them today, will no longer be needed. However, I don't think that time will come for a long time. I think in the meantime, knowledge workers will find it beneficial to begin adopting AI-powered tools and use them to increase their productivity. By doing so we make ourselves more valuable to the companies that we work for, but also potentially continue to push the needle on the level of abstractions that we operate at. Currently, most software engineers spend a lot of time writing and reading code that gets packaged and deployed for use by other developers, servers, or end users. What if the tools provide an opportunity to begin composing systems and applications at a layer higher than the code? I'm not sure if this is possible, but I think it's where things could be headed.

I think it's important to mention that all of the great advancement doesn't come without risk. The rapid development of some of these tools can result in bugs, vulnerabilities, and unforeseen exploits. We've seen API keys and credentials get leaked in GitHub Copilot. ChatGPT can hallucinate and provide completely incorrect answers to questions while appearing very confident in those answers. Users have also learned to use "prompt injection" attacks that result in the user's prompt being modified before being sent to the backend servers for the models to process.

I'll be honest, I've taken my time on joining the hype train that is gaining steam around AI-based tools, but there comes a time when the hype doesn't feel like hype anymore. For me, this is that time. I'm not planning to go off and build a startup around this technology, but I think it carries enough weight to at least begin familiarizing myself with the tools available and using them where I can.

Resources

https://simonwillison.net/2023/Apr/14/worst-that-can-happen/

https://changelog.com/podcast/534

https://podcasts.apple.com/us/podcast/ai-tools-today/id1602572955?i=1000608610109


ShareSubscribe
As always thank you for taking the time to read this blog post!