Q: Are you coming back to AppSumo anytime soon?
Missed it. Are you planning to come back to AppSumo?
Q: Can you list the upcoming features for CodeMate 3.0?
Hello!
I am on Tier 2 and was thinking about upgrading to the maximum tier before your campaign ends in a few hours. But given the many negative reviews, especially about heavy token consumption, could you tell me what new features and improvements will come with version 3.0? And, more importantly, will all of these updates be included for AppSumo LTD holders? Thanks!
Q: Maybe I will get an answer here...
I've lost my premium status on your site since the update... I've sent screenshots several times, along with the invoice showing the account and email address that should actually have premium... I think I first pointed this out in October 2024 or so...
hmmm...
Q: How does the token spend work?
Is 1 prompt = 1 token? Can you give some context on how many tokens a prompt consumes?
Ayush_CodeMate
Jan 7, 2025
A: Hi there,
Tokens in an LLM are calculated based on three factors:
1. Input: The query provided by the user.
2. Context: Additional information like files, searches, or knowledge bases attached to the query.
3. Output: The model's final response.
Simple queries with no added context consume fewer tokens. When context is attached, the system processes this data—searching, retrieving, and...
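To make that breakdown concrete, here is a minimal sketch of how a per-request token count could be estimated. It assumes the tiktoken library and the cl100k_base encoding purely for illustration; CodeMate's actual tokenizer and billing rules are not documented here and may differ.

import tiktoken

# Assumption (not from CodeMate): tiktoken with cl100k_base is used only to
# illustrate the input + context + output breakdown described above.
enc = tiktoken.get_encoding("cl100k_base")

def estimate_request_tokens(user_query: str, context_chunks: list[str], model_output: str) -> int:
    """Rough total: tokens in the query, in any attached context, and in the response."""
    input_tokens = len(enc.encode(user_query))
    context_tokens = sum(len(enc.encode(chunk)) for chunk in context_chunks)
    output_tokens = len(enc.encode(model_output))
    return input_tokens + context_tokens + output_tokens

# Example: one short prompt, one attached file snippet, one short answer.
print(estimate_request_tokens(
    "Refactor this function to use async/await.",
    ["def fetch(url):\n    return requests.get(url).text\n"],
    "Here is an async version using aiohttp: ...",
))
# The point: a single prompt costs far more than 1 token, and every attached
# file or knowledge-base chunk adds to the context term of the total.

Because the context term scales with everything attached to the query, prompts with large files or knowledge bases consume noticeably more tokens than simple standalone questions.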
Q: Will you support image upload?
I tried the chat, but it doesn't seem to support image upload, which would be useful for generating code from designs.
Ayush_CodeMate
Jan 2, 2025
A: Hi there,
Yes, image upload support is on our roadmap for CodeMate Agent.