Explore Vibe Coding with the Latest Visual Studio Preview

Microsoft’s newest Visual Studio preview introduces an approach known as “vibe coding,” in which developers lean on GitHub Copilot’s AI capabilities to write code from natural-language prompts.
What is Vibe Coding?
The term vibe coding recently surfaced in Microsoft documentation, specifically in a post from the Visual Studio Code team. It represents a notable shift in software development, letting developers approach programming tasks more intuitively and creatively, and it has already been highlighted in successful demonstrations inside the code editor.
Demonstrations of Vibe Coding
Last week, Amy Nguyen from Microsoft showcased vibe coding during a presentation titled “Build an App from Scratch with GitHub Copilot & Visual Studio.” Using Visual Studio 2022 version 17.14 Preview 2, she began with a Blazor Web App template. By simply prompting Copilot Edits to swap out the template’s counter page for a to-do list, she demonstrated the potential for natural language programming, supplemented by Copilot’s AI vision capabilities.
In a similar exercise, I created a mock-up image of a to-do app using ChatGPT. I then input that image into Copilot Edits, requesting it to generate the associated code for the application. Notably, the advanced vision features were available in version 17.14 Preview 2, while Copilot Edits was functional from version 17.13 onwards.
The outcome was promising, resulting in a functional to-do list application that allowed users to create, edit, and delete tasks.
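To give a sense of what such a generated page looks like, here is a minimal sketch of a Blazor to-do component with add, toggle, and delete. This is not Copilot's actual output; the route, the `InteractiveServer` render mode (needed for event handling in .NET 8 Blazor Web Apps), and all member names are illustrative.

```razor
@page "/todo"
@* Interactive render mode is an assumption; the default Blazor Web App template renders statically. *@
@rendermode InteractiveServer

<h3>To-Do</h3>

<input @bind="newTitle" placeholder="What needs doing?" />
<button @onclick="AddItem">Add</button>

<ul>
    @foreach (var item in items)
    {
        <li>
            <input type="checkbox" @bind="item.IsDone" />
            @item.Title
            <button @onclick="() => items.Remove(item)">Delete</button>
        </li>
    }
</ul>

@code {
    private sealed class TodoItem
    {
        public string Title { get; set; } = "";
        public bool IsDone { get; set; }
    }

    private readonly List<TodoItem> items = new();
    private string newTitle = "";

    private void AddItem()
    {
        // Ignore blank input; otherwise append a new task and clear the box.
        if (string.IsNullOrWhiteSpace(newTitle)) return;
        items.Add(new TodoItem { Title = newTitle.Trim() });
        newTitle = "";
    }
}
```

Swapping a component like this in for the template's `Counter.razor` is essentially the edit Nguyen asked Copilot to make in a single sentence.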
Nguyen’s prompt was straightforward: “Can you update the counter page into an interactive to-do app set up like this?” Her interaction with Copilot Edits showed how the tool handles both detailed and vague instructions, keeps developers informed about what it is doing, and generates a plan for each coding task.
Challenges Encountered with More Complex Tasks
After the success of the to-do app, I took on a more complex project: replacing the Blazor Web App’s weather page with an entirely new web app. I used a low-resolution image that VS Code’s Copilot had successfully turned into a web application in the past.
Alongside this image, I uploaded the Visual Studio Magazine logo as an anchor for the new web app. However, this is where difficulties arose.
I spent extensive time troubleshooting errors, as Copilot Edits struggled both to complete the task and to correct its mistakes, often creating new issues in the process. For example, I instructed Copilot to place the VSM logo image in a new “images” folder under “wwwroot.” It confirmed it had done so, but that turned out to be an illusion: it cannot actually create new folders. Copilot acknowledged the limitation, explaining that while it could provide instructions for creating folders, the task itself had to be done manually.
This was one of the simpler obstacles I faced. Rather than continuing to fix error after error, I asked ChatGPT to generate the initial code for the webpage and manually replaced the flawed page code with it, which proved effective in the end.
While this process could be labeled as vibe coding, it certainly introduced an array of challenges that tested the limits of AI-assisted development.