Building an AI chatbot and the journey to deployment on Cloudflare

Originally published: 28 December 2023

This one is still eluding me; it's almost like coding can be hard sometimes


Introduction

In the ever-evolving world of AI and web development, I’ve embarked on an ambitious project: creating an AI Chatbot, a ChatGPT clone, built on the OpenAI API. The concept is straightforward yet innovative: a website that extracts video transcripts from YouTube, sends them to OpenAI, and transforms them into comprehensive blog posts. Fascinating, right? However, this journey hasn’t been a walk in the park.
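To make the idea concrete, here is a rough sketch of that pipeline. It is illustrative only: `fetchTranscript` is a stand-in for whatever transcript fetcher gets wired up, and the model and prompt are placeholders rather than what the project actually uses.

```typescript
// Rough sketch of the intended pipeline: YouTube transcript in, blog post out.
import OpenAI from 'openai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Stand-in helper: swap in a real YouTube transcript fetcher.
async function fetchTranscript(videoId: string): Promise<string> {
  throw new Error(`No transcript fetcher wired up for ${videoId}`);
}

async function transcriptToBlogPost(videoId: string): Promise<string> {
  const transcript = await fetchTranscript(videoId);

  // Hand the transcript to OpenAI and ask for a structured blog post.
  const completion = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages: [
      { role: 'system', content: 'Rewrite the following transcript as a structured blog post.' },
      { role: 'user', content: transcript },
    ],
  });

  return completion.choices[0].message.content ?? '';
}
```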

The Challenges: Streaming Responses and Lengthy Blog Posts

I encountered two major hurdles during the development process:

  1. Lack of Streaming Responses: I simply could not get responses to stream back from the API. This obstacle proved more daunting than anticipated, as the available documentation offered little clarity or guidance.

  2. Generating Longer Blog Posts: The second challenge was coaxing OpenAI into delivering longer blog posts, a feature crucial to the essence of my project (a sketch of the relevant API settings follows this list).
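Both challenges map onto settings in the chat completions API: `stream: true` for token-by-token responses, and `max_tokens` plus an explicit instruction to nudge the model toward longer output. The sketch below shows those knobs; the model name and prompt wording are my placeholders, not what the finished project uses.

```typescript
// Two knobs relevant to the challenges above: stream the response as it is
// generated, and raise max_tokens while explicitly asking for a long post.
import OpenAI from 'openai';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function streamLongPost(transcript: string) {
  const stream = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo-16k',
    stream: true,
    max_tokens: 4000,
    messages: [
      {
        role: 'system',
        content: 'Write a detailed, multi-section blog post. Do not summarise; expand each point.',
      },
      { role: 'user', content: transcript },
    ],
  });

  // With stream: true the SDK returns an async iterable of chunks.
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
  }
}
```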

The Search for Solutions

Determined to overcome these challenges, I plunged into an exhaustive search for answers. This phase was marked by trials and a steep learning curve, particularly with the integration of Firebase for storage, which turned out to be quite intricate.
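The post doesn't go into how Firebase was wired up, so treat this as a minimal sketch of one plausible setup: the v9 modular SDK with Firestore persisting the generated posts. The config keys and collection name are placeholders.

```typescript
// Minimal sketch of persisting a generated post with the Firebase v9
// modular SDK, assuming Firestore as the storage layer.
import { initializeApp } from 'firebase/app';
import { getFirestore, collection, addDoc } from 'firebase/firestore';

const app = initializeApp({
  apiKey: process.env.FIREBASE_API_KEY,
  projectId: process.env.FIREBASE_PROJECT_ID,
});
const db = getFirestore(app);

export async function savePost(videoId: string, markdown: string) {
  await addDoc(collection(db, 'posts'), {
    videoId,
    markdown,
    createdAt: Date.now(),
  });
}
```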

A Breakthrough with Vercel

Then came a pivotal moment in my quest: discovering a Vercel template for a ChatGPT clone. It was a eureka moment! However, the template was primarily designed for deployment on Vercel and Netlify, and deploying it to Cloudflare presented a unique set of complexities.

The Cloudflare Conundrum

The primary challenge with Cloudflare was streaming: the chat responses arrive as Server-Sent Events (SSE), which means the API routes have to run on the Edge runtime, and getting that working on Cloudflare was anything but straightforward.
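For context, the streaming route in the Vercel template looks roughly like the sketch below (using the `ai` and `openai` packages); the `runtime = 'edge'` export is the part Cloudflare Pages insists on for dynamic routes. Treat the details as an approximation rather than my exact code.

```typescript
// app/api/chat/route.ts - approximate shape of the template's streaming route.
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

// Cloudflare Pages only runs dynamic Next.js routes on the Edge runtime.
export const runtime = 'edge';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function POST(req: Request) {
  const { messages } = await req.json();

  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    stream: true,
    messages,
  });

  // OpenAIStream adapts the OpenAI stream; StreamingTextResponse sends it
  // to the browser as a streamed HTTP response.
  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}
```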

Triumph Over Trials

After days of relentless effort and tinkering, I achieved what seemed impossible: deploying the AI Chatbot on Cloudflare.

The Outcome

At present I have it deployed to Cloudflare, but I’m seeing compatibility issues… I’ll get there in time.

Why Does This Matter?

You might wonder why this endeavor matters. The answer lies in the broader context of my goals. Vercel, while an excellent platform, charges for Pro hosting, which is essential for monetizing any product. My ultimate aim is to reverse-engineer this Chatbot template and integrate it into my AI tools SaaS, another exciting project.

Deploying on Cloudflare not only saves costs but also opens the door to Argo Smart Routing, which can significantly improve Time to First Byte (TTFB). For more insights into hosting speeds, check out my comparison at sureshkhirwadkar.dev/posts/comparing-hosting-speeds.

In conclusion, this journey has been a blend of challenges, learning, and eventual triumph. It underscores the importance of perseverance in the tech world and opens up new avenues for innovation in AI and web development.
