Author(s): Gabriela
Originally published in Towards AI.
The first step: when vibe coding becomes magic
When I first started vibe coding Genie-Hi, I really thought it would just help me move faster. I typed a few sentences describing what I wanted, hit Enter, and watched a functional product appear on the screen. For a brief moment I thought: so this is what 0 → 1 looks like in the era of artificial intelligence.
But the magic was just the beginning. The real work – the chaotic, unpredictable, and deeply educational work – came right after.
Where the challenges start to reveal themselves
1. Vibe coding is remarkable at getting you 80% of the way there. The remaining 20%, however, holds every trap you can imagine. Whenever I asked the model to “adjust a small detail”, something much bigger changed. It's like asking someone to tighten a screw in your kitchen, only to discover the next morning that they've completely remodeled your house and you can't even leave your bedroom.
2. Then came the long-thread trap. Keeping everything in one conversation seemed efficient at first, but after a few dozen rounds the model's consistency deteriorated. Instructions were misread, constraints were forgotten, and hallucinations appeared in unexpected places. Fixing problems inside this drifting thread only made it drift further.
3. Another mistake: I delayed Git. When you're building solo, it's easy to assume that version control can wait – until it can't. I spent hours debugging issues, only for a single new prompt to immediately undo the fixes. Without Git, vibe coding becomes a cycle of create, destroy, and reluctantly accept that AI can undo your work as quickly as it creates it.
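The escape from that cycle is cheap: commit a checkpoint before every risky prompt. A minimal sketch in plain Git (the directory, file, and commit-message names here are placeholders, not Genie-Hi's real repository):

```shell
# Initialize version control before the first AI-generated change.
git init vibe-demo && cd vibe-demo
git config user.email "demo@example.com" && git config user.name "Demo"

# Checkpoint the working state before handing the file to the model.
echo "v1" > app.txt
git add app.txt && git commit -m "checkpoint: before AI edit"

# If a prompt wrecks the working tree, one command undoes it:
echo "broken by a careless prompt" > app.txt
git checkout -- app.txt
cat app.txt   # back to "v1"
```

With this habit, a bad generation costs one `git checkout` instead of an evening of archaeology.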
But none of these challenges can be compared to what came next.
4. Latency wasn't an issue in Genie-Hi's early demo phase. Back then the product was much simpler, but as Genie-Hi matured into a viable product – with multiple flows, richer logic, and interconnected user-facing features – performance signals began to emerge. The application ran slower and slower, and latency became the earliest indicator that the system had crossed a complexity threshold.
Using AI tools, I started investigating the slowdown patterns and gradually discovered what was happening. As more features were layered on, small architectural decisions accumulated, causing noticeable friction:
- some API calls were triggered unintentionally in series instead of parallel
- some React components were re-rendering more than necessary
- the front-end bundle grew large enough to block the main thread
- ……
However, none of this is unusual. A demo can get away with almost anything; a product cannot. Although my background is in data science, I learned systems-design concepts out of necessity. The latency was a message: engineering fundamentals become non-negotiable once a product moves beyond the demo phase.
This realization changed the way I think about building Genie-Hi, or any vibe-coded app: it isn't just about adding features – it's about ensuring the system underneath can support them.
What I learned: Practices that actually work
Eventually, after enough trial, error, and random UI transformations, I developed a set of survival techniques that are now essential to how I vibe code.
1. Keep changes atomic and aggressively scoped.
“Atomic” means painfully specific. Generative models don't understand locality, so if you give them space, they will rewrite the entire neighborhood.
Now I always specify:
- the exact block – I paste only the relevant function, never the entire file
- what is allowed – e.g. “You may update the handleSubmit function only.”
- what is prohibited – “DO NOT modify any other component or file.” (This one sentence prevents 70% of the chaos.)
Necessary? Absolutely. This is the only way to stop Genie-Hi from turning into a different app every time I blink.
2. Don't let one model do it all. Treat AI tools like a distributed team.
I used to think that one powerful model could handle everything. Now I know better. Each tool has its own specialization, and development becomes much more stable if you respect this.
Here is the distributed “AI organizational chart” for Genie-Hi:

- ChatGPT → my thought partner
- AI Studio → my programmers
- Gemini → my UI crew
- Copilot → my QA section
- NotebookLM → my marketing and secretarial team
- and more to discover
When I divided up the responsibilities this way, the chaos decreased dramatically.
It became (ironically) the modular engineering process we all dreamed of.
3. Use multilingual prompts as a precision tool.
Who would have imagined that bilingualism has power in vibe coding? Surprisingly, switching between English and Chinese made development much more precise. When something doesn't land clearly in English, Chinese immediately disambiguates it – and vice versa. Multilingual prompting has become my strategy for debugging human-AI communication.
4. Build your E2E experience as you go.
Vibe coding makes every weakness visible – immediately. Genie-Hi forced me to learn, in the real world, that:
- UI/UX flow is just the beginning of a product – real UX comes from actually using and feeling the product
- Feature requests don't always represent progress – they can reveal structural debt.
- Architecture becomes a limitation of the product, whether you like it or not.
- Debugging is not a detour; it is the work.
- A prototyping mindset and an engineering mindset need to coexist, but not at the same time.
- E2E thinking means combining vertical knowledge with horizontal flow.
None of this came from textbooks. Vibe coding didn't replace engineering – it made engineering unavoidable.
Where Genie-Hi is now
Somewhere in the middle of all this learning, frustration, rebuilding and redesigning, Genie-Hi quietly leveled up. We refined the flows, fixed the architecture, reduced latency, simplified the UI, and rebuilt sections that weren't scalable.
The new version isn't ready for full reveal yet – but it feels alive, intentional, and much closer to the product I originally envisioned.
Consider this a soft teaser:
something exciting is coming and we can't wait to share it soon.
Published via Towards AI