Update: Windsurf moves Pro/Max plans to daily/weekly credits, introduces GLM 4.7 / GLM 5

New Controversial Windsurf Pricing Model

Windsurf’s new pricing model has caused a stir this week… we’re all waiting with bated breath to see how it pans out, hopefully in our favour.

So now there is Pro ($20) and Max ($200), as well as team plans… see Pricing | Windsurf for the feature breakdown at the bottom of the page.

Here is an estimate of the number of messages that you may expect to see:

| Model tier | Pro and Teams | Max |
|---|---|---|
| Premium Plus (Opus 4.6, GPT-5.4, GPT-5.3-Codex) | 7–27 messages / day | 42–170 messages / day |
| Premium (e.g. Sonnet 4.6, GPT-5.2, Gemini Pro) | 8–101 messages / day | 47–631 messages / day |
| Lightweight (e.g. Haiku, Flash) | 47–190 messages / day | 291–1,190 messages / day |

The usage limits assume a daily window. An additional weekly limit applies. The estimates are based on current usage patterns.

AI and BDD

It’s actually super hard to get AI to work in BDD mode and create specs BEFORE creating code, because it does the reasoning for the spec and the implementation at the same time, so you may as well get some of the code “for free” while you’re there. Working out that balance has been a learning experience too.

I now have a daily and weekly quota and can buy extra credits if I go over. I’m not sure what I think about that; I would have preferred a weekly/monthly quota, as some days I use it heavily and some days I don’t. So I’ll now need to create a bot that loads up a workflow of tasks, with testing, to see how we go.
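As a thought experiment, here’s a minimal sketch of what such a bot might look like (Python, purely hypothetical: the budget numbers and task list are made up, not real Windsurf figures). It drains a queue of tasks each day until the daily or weekly message budget runs out, carrying the rest over to the next day:

```python
from collections import deque

# Hypothetical budgets, in estimated message counts (NOT real Windsurf numbers)
DAILY_BUDGET = 27    # e.g. upper Pro estimate for a Premium Plus model
WEEKLY_BUDGET = 120  # assumed weekly cap

def run_day(tasks: deque, daily_left: int, weekly_left: int) -> list:
    """Pop tasks off the front of the queue until either budget is
    exhausted. Each task is a (name, estimated_messages) tuple; tasks
    left on the queue carry over to the next day."""
    done = []
    while tasks:
        name, cost = tasks[0]
        if cost > daily_left or cost > weekly_left:
            break  # can't afford this one today; stop and carry it over
        tasks.popleft()
        daily_left -= cost
        weekly_left -= cost
        done.append(name)
    return done

tasks = deque([("refactor SQL module", 10), ("write BDD specs", 8),
               ("arena: GLM 4.7 vs GLM 5", 12), ("update components", 6)])
print(run_day(tasks, DAILY_BUDGET, WEEKLY_BUDGET))
# → ['refactor SQL module', 'write BDD specs']
```

A real bot would also need per-model cost estimates and a scheduler, but the core idea is just budgeted queue draining.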

I’m a pretty heavy user at the moment, as I’m rearchitecting a lot of SQL code to 2025 and updating my components to the next gen, so I’ve been using Claude (Sonnet/Opus) a LOT. But the over-engineering means you have to put SOOOO many guardrails around it.

GLM 4.7 and GLM 5 in beta on Windsurf

Now that GLM 4.7 and GLM 5 are in beta on Windsurf and dirt cheap to use, it could be good to see what I can get out of them this week.

I’ll put both in the “arena” (another Windsurf feature that runs two AIs on the same task so you can compare the quality of their output) a few times as well… so it’s going to be an exciting week, as GLM 5 is meant to be a killer!

Windsurf blog on the new pricing model

Introducing our new Windsurf pricing plans


What’s changing:

  • No more credits. Your Free, Pro, or Teams plan includes a usage allowance that refreshes automatically on a daily and weekly basis. For the majority of users, this quota will be enough to fully cover all agent usage.
  • For paid plans, if you go beyond your included usage, you can purchase extra usage which will be consumed at API pricing.
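Reading between the lines, the accounting model described above seems to boil down to something like this (a hedged sketch only; the numbers and the `api_price_per_unit` figure are invented for illustration, and the real daily/weekly refresh logic is Windsurf’s, not mine):

```python
def charge(cost: float, daily_left: float, weekly_left: float,
           api_price_per_unit: float = 1.0):
    """Apply one request's cost against the included daily/weekly
    allowances; whatever the allowances can't cover is billed as
    'extra usage' at API pricing (pay-as-you-go)."""
    covered = min(cost, daily_left, weekly_left)
    overflow = cost - covered
    return (daily_left - covered,       # daily allowance remaining
            weekly_left - covered,      # weekly allowance remaining
            overflow * api_price_per_unit)  # extra-usage charge

# A 10-unit request with only 6 daily units left but plenty of weekly room:
print(charge(10, 6, 50))  # → (0, 44, 4.0)
```

The free plan presumably just rejects the overflow instead of billing it, since extra usage is for paid plans only.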

The CASCADE and PLAN modes are incredible in this IDE; they make Copilot look like a toy. (Windsurf is built on Monaco, of course, like VS Code, so you can add CFML extensions too.)

Also, the workflows, skills, rules and memories features are a life-changer; I’ve moved totally away from instruction files now. I’ll post again about that once I get a full handle on how best to configure the combo.

And with GLM 4.7 at 0.25 credits (INSANE) and GLM 5 at 1.25 credits, that’s insane value and about 12 times cheaper than Claude at the moment…

I’m looking to self-host GLM 4.7 32B locally soon; that will be incredible…

ATM I’m also using SWE 1.5 for free on Windsurf for everyday tasks, but it’s VERY simple.


If you want to check it out and get some freebies, use this link: Get $10 in extra credits for Pro

This is getting close to being off topic, this is the Lucee forum yeah?

Tell us about how you’re actually using AI with Lucee, I’m super curious

I get what you’re saying, but not everything in Lucee is just Lucee. Half of everyone’s dev is DB and JavaScript too, so we can’t be too purist about things; that causes lag in our community, which in the past has sometimes been decades behind other tech in terms of techniques and application. Also, I’m not sure how talking about an IDE in the Tools section of a developer forum is off topic.

I’ll try to post MORE useful info SPECIFICALLY for Lucee in future. I just wanted to get this update out there, because for many small devs THIS IS A GAME-CHANGER!

Especially at the moment, when smaller shops are strapped for cash, AI IDEs with the right setup can provide much-needed help they might not otherwise be able to afford.

I really think that with GLM 5, within a month I’ll be able to write any module on the planet in good-quality cfscript in minutes, just by pointing the AI at industry-leading projects and a set of rules, workflows and memories in Windsurf.

That said, I read EVERY line of code AI ever writes for me. I am meticulous about how code in my frameworks is written, and I use AI to find industry standards for security, testing and application that I can add to my ecosystem daily, things I would never have known about on top of 30 years of experience across many industries. You can’t know it all, and AI is not always going to provide the solution to everything, but right now, in some areas, these tools are killer.

I have a friend who writes the neomjs JavaScript framework (an amazing, cutting-edge, next-gen framework), and it’s built and maintained by devs using AI too; he’s able to crush hundreds of tickets a day with an automated MCP server for his framework. I’d love to introduce you two, as I think Lucee could benefit in some way from this approach, especially with a small team. No doubt you already have your workflows, but I wonder if his methods or Windsurf’s approach could help Lucee advance faster…

Anyway, I’ll post more detail about what I’m doing. But the whole point of telling people about this is that I have seen a 400% increase in output from moving from VS Code to Windsurf. I still code by hand and check every line, with AI doing boilerplate and deep thinking (moving from Claude to GLM), but, much like the CommandBox CLI generating boilerplate for ColdBox and sticking to standards, I can stand up new modules and integrations in minutes rather than days.

That is a HUGE competitive advantage. As Windsurf has changed dramatically in the last month, I have to revisit my setup completely, and then I’ll post something.

I need a Lucee MCP server right now, so short of baking one myself, for now I’ll just have to live with the setup I’m revisiting. Has anyone baked a Lucee 7.x MCP server? I’d love to talk to them.
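For anyone curious about what “baking one” involves: MCP is essentially JSON-RPC 2.0 over stdio, so the skeleton is small. Here’s a hypothetical sketch in Python using only the stdlib (the `lucee_version` tool is entirely made up, and the MCP initialization/capability handshake and stdio framing are omitted for brevity; a real server would use an MCP SDK):

```python
import json

# Hypothetical tool: the kind of thing a Lucee MCP server might expose.
TOOLS = [{"name": "lucee_version",
          "description": "Return the Lucee engine version (stubbed here)",
          "inputSchema": {"type": "object", "properties": {}}}]

def handle(request: dict) -> dict:
    """Dispatch a single JSON-RPC 2.0 request. MCP defines methods like
    tools/list and tools/call; only those two are sketched here."""
    method = request.get("method")
    if method == "tools/list":
        result = {"tools": TOOLS}
    elif method == "tools/call" and request["params"]["name"] == "lucee_version":
        # A real implementation would call into a running Lucee instance.
        result = {"content": [{"type": "text", "text": "7.0.0 (stubbed)"}]}
    else:
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "error": {"code": -32601, "message": "method not found"}}
    return {"jsonrpc": "2.0", "id": request.get("id"), "result": result}

print(json.dumps(handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})))
```

The interesting work is obviously in the tools themselves (querying a running Lucee server, docs lookup, component introspection), not the plumbing.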
