That's an interesting idea to generate JavaScript and execute it client-side rather than server-side. I'm sure that saves a ton of money for Anthropic by not having to spin up a server for each execution.
I've been trying for a while to figure out the right pattern for running untrusted JavaScript code in a browser sandbox controlled by the host page; it looks like Anthropic have figured that out. I'm hoping someone can reverse engineer exactly how they're doing it. Their JavaScript code is too obfuscated for me to dig out the tricks, sadly.
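For what it's worth, one common pattern is a sandboxed cross-origin iframe (`sandbox="allow-scripts"`) talking to the host page over `postMessage`, with the guest evaluating code behind a shadowed scope. I have no idea if that's what Anthropic actually do, but the scope-shadowing half of the idea can be sketched in plain JS. `runUntrusted` and the `blocked` list are my own illustrative names, and on its own this is nowhere near a real sandbox (it doesn't stop prototype walking, infinite loops, etc.):

```javascript
// Sketch only: evaluate untrusted source with the Function constructor,
// shadowing dangerous globals as parameters bound to undefined, and
// exposing only a small explicit API. A real sandbox would run this
// inside a sandboxed cross-origin iframe or Worker, not the host page.
function runUntrusted(source, api) {
  // Globals the untrusted snippet must not reach directly.
  const blocked = ['fetch', 'XMLHttpRequest', 'document', 'window'];
  // Parameter names: blocked globals first, then the allowed API names.
  const params = [...blocked, ...Object.keys(api)];
  // Arguments: undefined for every blocked name, real values for the API.
  const args = [...blocked.map(() => undefined), ...Object.values(api)];
  const fn = new Function(...params, `"use strict"; return (${source});`);
  return fn(...args);
}

// The untrusted code can only see what we pass in.
const result = runUntrusted('add(2, 3) * 2', { add: (a, b) => a + b });
console.log(result); // 10
console.log(runUntrusted('typeof fetch', {})); // "undefined"
```

The iframe/`postMessage` layer is what provides the actual security boundary (separate origin, no DOM access); the shadowing trick above just keeps the guest's API surface small and explicit.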
Funnily enough, I test code generation on both the free tiers of Claude and ChatGPT.
When working with Python, I've found Sonnet (pre-3.5) to be quite superior to ChatGPT (mostly 4, sometimes 3.5) with regard to verbosity, structure, and prompt/instruction comprehension.
I switched to a JavaScript project two weeks ago and the tables have turned.
Sonnet 3.5 is much more verbose and I need to make corrections a few times, whereas ChatGPT's output is shorter and on point.
I'll be following closely to see whether this improves now that Anthropic are focusing on JS themselves.
duckdb-wasm[0] would be a good addition here. We use it in Definite[1] and I can't say enough good things about duckdb in general.
JavaScript is the perfect language for this. I can't wait for a sandboxed coding environment to totally set AI loose.
NGL, I was impressed when I asked Claude how to do some fancy UI stuff and it just spat out some working React. A few hours later and I'd saved £500 I was going to spend on a designer.
This is a great step, but to me not very useful until they move out of context. Still, I'm high on Anthropic and happy gen AI didn't turn into a winner-take-all market like everyone predicted in 2021.
It will work for any generic data, like a blog post. You can ask it to visualize the 'key concepts'.
The custom instructions to the model say:
"Please note that this is similar but not identical to the antArtifact syntax which is used for Artifacts; sorry for the ambiguity."
They seem to be apologizing to the model in the system prompt?? This is so intriguing.