sdspurrier's comments

cto.new creates a new cto chat session from the homepage


> Navigation to a .new domain must bring a user directly into the action generation or online creation flow. Navigation or redirection to a homepage or landing page that requires the user to take additional steps or clicks to initiate action or creation will not be deemed to comply with this policy.

https://www.registry.google/policies/registration/new/

I’m not sure pointing it at their homepage, which happens to have a text box on it, is really what they had in mind. It’s a page where you can take steps to create a chat session; it isn’t a new chat session in itself. This is like facebook.new bringing you to your main feed, since there’s a “What’s on your mind?” text box where you can create a post. Compare that with something like docs.new, which automatically creates a document and opens it for editing with zero steps.


60% of the time, it works every time


Depends on your set of tasks, but we use Engine for the bottom ~50% of issues by complexity. We have a pretty good SWE-bench score from a while back, and it's gotten much better since!

We have also focused on workflow integrations, so you can assign issues from Linear, Jira, Trello, etc., which makes it more useful for teams.


Seems to me that integrations will be the most important component of tools like this. As an engineer, I get my context from video calls with customers and other engineers, Slack messages, emails, online docs, using the product myself, etc. So an auto-engineer should do the same.


We often find that some models perform better on certain types of repo. For example, Claude 3.5/3.7 is typically much better at frontends. That's why we let you switch up the model for each repo.
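Conceptually it's just a per-repo mapping; a rough sketch of the routing (repo names and model IDs here are illustrative, not our actual config):

  // Illustrative sketch of per-repo model routing -- repo names and
  // model IDs are made up, not our actual config.
  const repoModels: Record<string, string> = {
    "acme/web-frontend": "claude-3-7-sonnet", // frontend-heavy repo
    "acme/billing-api": "gpt-4o",             // backend repo
  };

  function modelForRepo(repo: string): string {
    return repoModels[repo] ?? "claude-3-7-sonnet"; // default model
  }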


The landing page? We are working on it! I posted a link directly to the app earlier, but it got flagged off the top of the front page :(


We built this framework to manage increasingly complex prompts and tool calls for LLM conversations, with hot-swappable models. We use this framework for our AI engineering platform at enginelabs.ai.

It allows us to guardrail and extend LLMs for different software stacks with varying degrees of restriction in a relatively clean and manageable way.

We're interested to see if this framework is useful in other applications or for custom software development configurations.
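As a minimal sketch of the shape (illustrative only, not the framework's real API): a conversation holds a swappable model adapter plus a chain of guardrails that every prompt passes through.

  // Illustrative only -- not the framework's actual API.
  type Message = { role: "system" | "user" | "assistant"; content: string };

  interface ModelAdapter {
    complete(messages: Message[]): Promise<string>;
  }

  interface Guardrail {
    // Return the (possibly rewritten) prompt, or throw to block it.
    check(prompt: string): string;
  }

  class Conversation {
    constructor(private model: ModelAdapter, private guardrails: Guardrail[]) {}

    // Hot-swap the underlying model without losing conversation state.
    setModel(model: ModelAdapter): void {
      this.model = model;
    }

    async send(prompt: string): Promise<string> {
      const checked = this.guardrails.reduce((p, g) => g.check(p), prompt);
      return this.model.complete([{ role: "user", content: checked }]);
    }
  }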


There are lots of cool AI app builders out there, but the backend often seems to be neglected.

I thought it would be cool to focus on a conversational backend builder. There's a frontend tool to try too, but it's very alpha for now.

Stack:

  - Fastify Node.js app on Fly.io  
  - libSQL database on Turso  
  - managed OAuth  
  - Vite + React app on Fly.io

Hope you enjoy giving it a try. Reach out if you hit any limits.

I'd be happy to have a discussion here.
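If you're curious what a minimal slice of that stack looks like, here's a rough Fastify + libSQL sketch (placeholder route and table, not code the builder actually generates):

  // Minimal Fastify + libSQL (Turso) sketch -- placeholder route and table,
  // not what the builder actually generates.
  import Fastify from "fastify";
  import { createClient } from "@libsql/client";

  const db = createClient({
    url: process.env.TURSO_DATABASE_URL!,   // e.g. libsql://<db-name>.turso.io
    authToken: process.env.TURSO_AUTH_TOKEN,
  });

  const app = Fastify();

  app.get("/todos", async () => {
    const result = await db.execute("SELECT id, title FROM todos");
    return result.rows;
  });

  app.listen({ port: 3000, host: "0.0.0.0" });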


Hi! I helped make this. We're playing around with using LLM platforms as an interface to external services. In this case, a managed Postgres database and BaaS platform.

If you don't have ChatGPT Plus you can try out similar functionality on our own web app https://app.backengine.dev/sign-up-ac0eba6d-2cc3-4873-9b76-9...

