Colin
@colin
Prompt engineering is more challenging than expected. Learned this by building @paragraph.
- Token length too large, so we need to preprocess posts before summarization
- ChatGPT does a bad job at estimating the character count of its response, so we often hit FC char limits (need retry logic)
- It often hallucinates
3 replies
0 recast
0 reaction
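A minimal sketch of the retry logic mentioned above: re-request the summary with a shrinking character budget until it fits the cast limit. Everything here is an assumption for illustration — `summarize` is a hypothetical stand-in for the model call (a real version would wrap an OpenAI completion), and the 320-character cast limit is assumed, not taken from the thread.

```python
MAX_CAST_CHARS = 320  # assumed Farcaster cast limit, not confirmed in the thread

def summarize(text: str, max_chars: int) -> str:
    # Hypothetical stub: a real implementation would prompt the model
    # with an explicit character budget. Here we truncate to demonstrate
    # the control flow only.
    return text[:max_chars]

def summarize_with_retries(text: str, retries: int = 3) -> str:
    # Ask for a summary; if the model overshoots the limit (as ChatGPT
    # often does when estimating its own output length), shrink the
    # budget and try again.
    budget = MAX_CAST_CHARS
    summary = summarize(text, budget)
    for _ in range(retries):
        if len(summary) <= MAX_CAST_CHARS:
            return summary
        budget = int(budget * 0.8)  # shrink the ask and retry
        summary = summarize(text, budget)
    return summary[:MAX_CAST_CHARS]  # last resort: hard truncate
```

The shrinking budget compensates for the model's unreliable self-estimates rather than trusting a single request.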
Giuliano Giacaglia
@giu
I've learned the same about token length. I'm building netbox and have been working on a prototype. Token length is def an issue. Only when building do you realize the bottlenecks of new tech
1 reply
0 recast
0 reaction
Giuliano Giacaglia
@giu
Btw are you using the OpenAI API or an open source model? The problem I see with the OpenAI API is that there's no way to set a hard spending limit, as far as I can tell
1 reply
0 recast
0 reaction
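One workaround for the spending concern is a client-side cap: OpenAI API responses include a `usage` field with token counts, so you can track cumulative cost yourself and stop calling once a budget is hit. A rough sketch, where the per-token price is an assumed placeholder and `SpendTracker` is a hypothetical helper, not part of any SDK:

```python
PRICE_PER_1K_TOKENS = 0.002  # assumed rate in USD; check current pricing

class SpendTracker:
    """Client-side budget guard for API usage (illustrative only)."""

    def __init__(self, budget_usd: float):
        self.budget = budget_usd
        self.spent = 0.0

    def record(self, total_tokens: int) -> None:
        # After each API call, pass in usage["total_tokens"] from the
        # response. Raises once the estimated spend reaches the budget.
        self.spent += total_tokens / 1000 * PRICE_PER_1K_TOKENS
        if self.spent >= self.budget:
            raise RuntimeError("budget exhausted; stop calling the API")
```

This only limits future calls from your own client, so it is an approximation, not a substitute for a provider-side billing cap.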