How to Write Your Prompt to Extract the Expected Output
19-10-2024
Prompt Engineering is experiencing remarkable growth. While it is widely acknowledged that this field is not a panacea and will not fulfil the extravagant promises of earning crores of rupees touted by some self-proclaimed social media influencers, it can contribute significantly to personal and professional development.
Generative AI and prompt-based machine learning projects are gaining traction due to their rapid delivery and strong accuracy, particularly in Natural Language Processing initiatives. Recently, I delved into Document Classification and Entity Extraction projects, which could also be framed as Retrieval-Augmented Generation. These projects, implemented with Prompt Engineering, were developed swiftly and successfully addressed complex business challenges.
Prompt Engineering:
What is a prompt? A prompt is the input or question provided to a system, usually a Large Language Model (LLM), so that the required output can be extracted. Examples:
Write a Python function to calculate the factorial of a given number.
Write a lively 100-word poem about football for 6-year-olds.
Analyze the sentiment of the following sentence: ‘I really enjoyed the movie.’
What is Prompt Engineering? Prompt engineering is the process of crafting or designing effective prompts to interact with language models like GPT (Generative Pre-trained Transformer).
There are many points to consider while creating your prompt. Based on my experience, here is a list of points worth keeping in mind:
Be clear and precise with your prompt: Be very clear about what you expect from the LLM. The LLM cannot read your mind, but it can read your prompt, so include every important point. For example, if you want the LLM to classify a document as Negative or Positive, state explicitly that the only expected answers are Negative and Positive. Being clear does not mean adding unnecessary information, though: irrelevant details can confuse the LLM, produce unwanted answers, and in some cases even lead to hallucination. So be precise with your prompt, and state the expected answer precisely.
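As a minimal sketch of this point, the snippet below builds a classification prompt that states the task, restricts the answer to the two allowed labels, and forbids extra commentary. The function name and label set are illustrative choices, not from any particular library:

```python
# Sketch: a precise classification prompt with constrained labels.
# ALLOWED_LABELS and build_sentiment_prompt are illustrative names.

ALLOWED_LABELS = ("Positive", "Negative")

def build_sentiment_prompt(document: str) -> str:
    """State the task, the only allowed answers, and nothing else."""
    return (
        "Classify the sentiment of the document below.\n"
        f"Answer with exactly one word: {ALLOWED_LABELS[0]} or {ALLOWED_LABELS[1]}.\n"
        "Do not add any explanation.\n\n"
        f"Document: {document}"
    )

prompt = build_sentiment_prompt("I really enjoyed the movie.")
print(prompt)
```

The key design choice is that the constraint ("exactly one word", the two labels) lives in the prompt itself, so the model is never left to guess the answer space.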
Use few-shot prompts: Business problems can be complex, so don't hesitate to use one-shot or few-shot prompts rather than relying only on zero-shot prompts. Giving one or more examples helps the LLM understand more context about the problem.
Divide and conquer: If one prompt contains multiple complex tasks (such as classifying a document and then extracting information from it), consider splitting it into several simple prompts. LLMs perform better on simple tasks than on complex ones.
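A rough sketch of this split, assuming a generic `call_llm` client (stubbed here with canned replies so the example runs without any API): step one classifies the document, step two extracts fields using the classification result.

```python
# Sketch: splitting one complex prompt into two simple ones.
# call_llm is a hypothetical stand-in for your LLM client; it is
# stubbed with canned answers so the pipeline runs as-is.

def call_llm(prompt: str) -> str:
    # Placeholder responses for demonstration only.
    if prompt.startswith("Classify"):
        return "Invoice"
    return '{"invoice_number": "INV-001"}'

def classify_document(text: str) -> str:
    # Step 1: one simple task -- classification only.
    return call_llm(f"Classify the document below as Invoice or Contract.\n\n{text}")

def extract_fields(text: str, doc_type: str) -> str:
    # Step 2: one simple task -- extraction, informed by step 1's answer.
    return call_llm(f"Extract the key fields of this {doc_type} as JSON.\n\n{text}")

doc = "Invoice INV-001 dated 19-10-2024 ..."
doc_type = classify_document(doc)
fields = extract_fields(doc, doc_type)
print(doc_type, fields)
```

Chaining the two calls also lets you validate the intermediate answer before spending a second call on extraction.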
Iterate on the prompt: Writing a prompt is definitely not a one-time task of producing the best prompt immediately. It is an evolving process in which you improve the prompt slowly and steadily, which might take from a couple of days to a couple of weeks depending on the complexity of your business problem.
Specify the output format: If you know the expected output format, be it JSON, a table, or a one-word answer, don't hesitate to ask the LLM to answer in that format. It can save unnecessary reformatting of your output data.
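As a small sketch of this, the snippet below asks for JSON explicitly and then validates the reply with the standard `json` module. The `reply` string is a hypothetical model response, hard-coded so the example runs on its own:

```python
import json

# Sketch: requesting JSON output and validating it on receipt.
# `reply` stands in for a real model response.

prompt = (
    "Extract the sentiment of the review below.\n"
    'Respond with JSON only, in the form {"sentiment": "Positive"} '
    'or {"sentiment": "Negative"}.\n\n'
    "Review: I really enjoyed the movie."
)

reply = '{"sentiment": "Positive"}'  # placeholder for the model's answer

data = json.loads(reply)  # raises ValueError if the model ignored the format
assert data["sentiment"] in ("Positive", "Negative")
print(data["sentiment"])
```

Parsing the reply immediately turns a format violation into a loud error instead of silently corrupting downstream data.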
Specify the output word count: It can also be helpful to specify the expected number of output words, for example "Summarize this article in at most 50 words."