7 Comments

You wrote a bunch of steps and made a lot of inside jokes along the way, but where is the result text generated by the fine-tuned LLM? Was it accurate enough? That seems like the most important part.


Maybe I am that LLM... follow-up article coming soon.


So what was the result of the example?


It's me. I replaced myself with an LLM.


I love your writing style.

I'm also curious about the results you generated. Did the model catch your voice?


Haha, thanks. I should write the follow-up article on that, shouldn't I? Stop keeping everyone in suspense.


Wow, fantastic read! Really enjoyed the journey you took us on in fine-tuning an LLM with your own blog posts. It’s eye-opening to see how much of the process relies on robust Data Engineering skills. Your points resonate a lot with what we discussed in our Data Engineering CV Blog:

https://dlthub.com/blog/data-engineering-cv

We often mention how critical it is for aspiring data engineers to demonstrate a broad skill set, and your experience clearly highlights that—from data munging to setting up cloud resources for training models.

Interestingly, we at dltHub have also tackled some challenges in the orchestration space, which seems like it would complement the steps you outlined for automating the environment setup and data preparation nicely. It might be something worth exploring with your workflow to save more time (more mountain living!).

Looking forward to seeing the results generated by your fine-tuned LLM. Any insights on its effectiveness? Keep up the great work! 🚀

Best,

Aman Gupta

DLT Team
