
Sentiment-Driven Emotional Context Adaptation and Empathetic Response Generation in AI-Generated Text

Abstract: The present disclosure provides a system (100) for enhancing artificial intelligence (AI)-generated text with sentiment-driven emotional context adaptation and empathetic response generation. The system features a data reception module (102) for acquiring input text from users, and a sentiment analysis module (104) that determines and categorizes the emotional tone of the text, including emotions such as happiness, sadness, or anger. A response generation module (106) then creates and adjusts AI-generated responses to mirror the identified emotional tones. An emotional cues integration module (108) enriches responses with emoticons or empathetic wording, while a feedback module (110) refines the performance of the system, maintaining contextually relevant and emotionally resonant communication.

Drawings: FIG. 1, FIG. 2, FIG. 3, FIG. 4


Patent Information

Application #
Filing Date
26 April 2024
Publication Number
23/2024
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

MARWADI UNIVERSITY
MARWADI UNIVERSITY, RAJKOT- MORBI HIGHWAY, AT GAURIDAD, RAJKOT – 360003, GUJARAT, INDIA
DR. ANJALI DIWAN
MARWADI UNIVERSITY, RAJKOT- MORBI HIGHWAY, AT GAURIDAD, RAJKOT – 360003, GUJARAT, INDIA
MS. RESHMA SUNIL
MARWADI UNIVERSITY, RAJKOT- MORBI HIGHWAY, AT GAURIDAD, RAJKOT – 360003, GUJARAT, INDIA
MS. PARITA MER
MARWADI UNIVERSITY, RAJKOT- MORBI HIGHWAY, AT GAURIDAD, RAJKOT – 360003, GUJARAT, INDIA

Inventors

1. DR. ANJALI DIWAN
MARWADI UNIVERSITY, RAJKOT- MORBI HIGHWAY, AT GAURIDAD, RAJKOT – 360003, GUJARAT, INDIA
2. MS. RESHMA SUNIL
MARWADI UNIVERSITY, RAJKOT- MORBI HIGHWAY, AT GAURIDAD, RAJKOT – 360003, GUJARAT, INDIA
3. MS. PARITA MER
MARWADI UNIVERSITY, RAJKOT- MORBI HIGHWAY, AT GAURIDAD, RAJKOT – 360003, GUJARAT, INDIA

Specification

Description: Field of the Invention

The present disclosure relates to artificial intelligence systems, specifically a system for enhancing AI-generated text with adaptive emotional context and empathetic responses based on user-provided input text.
Background
The background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
In the field of artificial intelligence (AI), a paramount challenge persists in the generation of text that accurately conveys emotional context. Traditional AI systems have demonstrated a marked deficiency in understanding and expressing emotions. As a consequence, responses generated by such systems are often perceived as mechanical and devoid of empathy. The absence of emotional context in AI communications represents a significant limitation of existing technologies, impeding the realization of interactions that are perceived as natural by users.
The conventional approach to AI-generated text typically involves the generation of responses based solely on the informational content of user input without adequate consideration for the emotional subtext. Said approach frequently results in responses that, while factually correct, fail to address the emotional nuances conveyed by the user. Said shortcoming leads to interactions that users may find unsatisfactory due to the perceived insensitivity of the AI system, thereby diminishing the overall user experience.
In addressing the aforementioned deficiencies, prior art systems have attempted to incorporate basic emotional recognition capabilities. However, said systems have been limited in their ability to detect specific emotions with the required granularity and often lack the ability to adjust responses in a manner that authentically reflects the emotional states of happiness, sadness, or anger. As such, the responses generated by said systems do not adequately mirror the emotional tone of the input text, leading to a dissonance between the user expectations and the output.
Furthermore, user engagement is an important aspect of AI system design that has not been satisfactorily addressed by previous technologies. Engagement on an emotional level is essential for interactions that are perceived as meaningful and personal. Prior art systems, by generating responses that lack emotional resonance, have failed to establish a connection with users, often resulting in a user experience that is deemed impersonal and robotic.
The effectiveness of communication is another area where prior art systems exhibit significant limitations. Effective communication is not solely the transmission of information but also involves an understanding of and response to emotional cues. Prior systems have not effectively utilized emotional cues in user input, leading to a communication breakdown and the potential for misinterpretation of user intent. Said deficiency hinders the ability of said AI system to participate in a clear and meaningful dialogue, thus restricting the utility in applications where effective communication is significant.
Personalization of responses has been recognized as an important feature for enhancing user experience. Prior systems have lacked the capability to personalize responses based on the emotional content of the input text. As a result, interactions with AI systems have lacked the individualized approach that is key to creating a sense of understanding and consideration for the emotional state of the user.
Miscommunication is a prevalent issue that arises when the emotional context is not accurately captured or conveyed. Prior art systems have been inadequate in their consideration of emotional context, which can lead to miscommunication and misunderstandings between AI systems and users. Said miscommunication impairs the efficiency of the interaction and also affects the perceived reliability and intelligence of the AI system.

Hence, the drawbacks of prior art in the field of AI-generated text centre on the lack of emotional depth, inadequate engagement, ineffective communication, insufficient personalization, and a propensity for miscommunication. Prior art limitations underline the necessity for a system capable of adapting the emotional context and generating empathetic responses in AI-generated text to address the multifaceted challenges presented by the emotional dimensions of human-AI interaction.
Summary
The following presents a simplified summary of various aspects of this disclosure in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects, and is intended to neither identify key or critical elements nor delineate the scope of such aspects. Its purpose is to present some concepts of this disclosure in a simplified form as a prelude to the more detailed description that is presented later.
The following paragraphs provide additional support for the claims of the subject application.
The disclosure pertains to a system 100 for sentiment-driven emotional context adaptation and empathetic response generation in artificial intelligence (AI) generated text. Said system 100 includes a data reception module 102 for receiving input text from a user. A sentiment analysis module 104 is configured to determine the emotional tone of said input text, including happiness, sadness, or anger, and to categorize the determined emotional tone as positive, negative, or neutral.
A response generation module 106 is configured to produce an AI-generated response that reflects the determined emotional tone of said input text and to adjust said AI-generated response based on the determined emotional tone. An emotional cues integration module 108 is configured to incorporate emoticons or specific wording into said AI-generated response to convey empathy and understanding.
Additionally, a feedback module 110 is provided for receiving feedback from a user to maintain the contextual relevance and emotional resonance of said AI-generated response. Furthermore, said sentiment analysis module 104 comprises a natural language processing engine utilizing machine learning algorithms for emotional tone detection. Moreover, said response generation module 106 includes a dialogue management unit for tailoring the structure of said AI-generated response to emulate conversational patterns observed in human communication.
Furthermore, said emotional cues integration module 108 is configured to select emoticons or specific wording based on a cultural and linguistic database, accounting for regional and demographic variations in emotional expression. Said feedback module 110 comprises an adaptive learning component that updates response strategies based on user feedback patterns. Additionally, said sentiment analysis module 104 utilizes deep learning networks to discern underlying emotional subtleties in said input text that are not explicit.
Said response generation module 106 is further configured to apply syntactic and semantic rules of the language of said input text to maintain grammatical coherence in said AI-generated response. Moreover, said feedback module 110 includes a sentiment drift detection component that identifies shifts in user sentiment over the course of interactions to adjust said emotional tone. Said sentiment analysis module 104 is further configured to determine the intensity of the emotional tone and to adjust the intensity of the emotional tone or expression in said AI-generated response accordingly.
The present disclosure provides a method 200 for sentiment-driven emotional context adaptation in artificial intelligence (AI) generated text. The method includes several steps beginning with the receipt of input text from a user in a data reception module 102. Following said initial step, a sentiment analysis module 104 undertakes the determination of the emotional tone of said input text, which encompasses emotions such as happiness, sadness, or anger. Subsequently, said sentiment analysis module 104 categorizes the determined emotional tone as positive, negative, or neutral.
Upon the categorization of the emotional tone, a response generation module 106 produces an AI-generated response that reflects the determined emotional tone of said input text. Said production step is important for ensuring that the generated response aligns with the emotional state of the user. To further refine the response, said response generation module 106 adjusts said AI-generated response based on the determined emotional tone. Said adjustment ensures that the response is not only accurate in reflecting the emotional tone but also appropriate in the delivery and content.
Additionally, an emotional cues integration module 108 incorporates emoticons or specific wording into said AI-generated response. Said incorporation is aimed at conveying empathy and understanding, which are essential components of effective communication, especially in contexts requiring emotional intelligence. Finally, a feedback module 110 receives feedback from a user. Said feedback is instrumental in maintaining the contextual relevance and emotional resonance of said AI-generated response. By continually adapting to user feedback, the system ensures that the responses remain not only contextually appropriate but also emotionally resonant, thereby enhancing the overall user experience.
Through said method, the disclosure aims to enable the creation of AI-generated texts that are not only contextually accurate but also emotionally intelligent. By adapting to the emotional context of the user input, the system aims to provide responses that are empathetic and understanding, thereby fostering a more natural and engaging interaction between the AI and the user.

Brief Description of the Drawings

The features and advantages of the present disclosure would be more clearly understood from the following description taken in conjunction with the accompanying drawings in which:
FIG. 1 illustrates a system for sentiment-driven emotional context adaptation and empathetic response generation in artificial intelligence (AI) generated text, in accordance with the embodiments of the present disclosure.
FIG. 2 illustrates a method for sentiment-driven emotional context adaptation in artificial intelligence (AI) generated text, in accordance with the embodiments of the present disclosure.
FIG. 3 illustrates a working data flow diagram (DFD) of the method for enhancing AI-generated text with emotional context and empathy, in accordance with the embodiments of the present disclosure.
FIG. 4 illustrates a flowchart of the method for processing and generating text with emotional intelligence.
Detailed Description
In the following detailed description of the invention, reference is made to the accompanying drawings that form a part hereof, and in which is shown, by way of illustration, specific embodiments in which the invention may be practiced. In the drawings, like numerals describe substantially similar components throughout the several views. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized and structural, logical, and electrical changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims and equivalents thereof.
The use of the terms “a” and “an” and “the” and “at least one” and similar referents in the context of describing the invention (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The use of the term “at least one” followed by a list of one or more items (for example, “at least one of A and B”) is to be construed to mean one item selected from the listed items (A or B) or any combination of two or more of the listed items (A and B), unless otherwise indicated herein or clearly contradicted by context. The terms “comprising,” “having,” “including,” and “containing” are to be construed as open-ended terms (i.e., meaning “including, but not limited to,”) unless otherwise noted. Recitation of ranges of values herein are merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
Pursuant to the "Detailed Description" section herein, whenever an element is explicitly associated with a specific numeral for the first time, such association shall be deemed consistent and applicable throughout the entirety of the "Detailed Description" section, unless otherwise expressly stated or contradicted by the context.
The present disclosure relates to a system 100 for sentiment-driven emotional context adaptation and empathetic response generation in artificial intelligence (AI) generated text. Said system 100 encompasses several modules arranged to interact seamlessly to provide a user-centric, empathetic communication experience. Each module within the system 100 plays a pivotal role in processing user input, analyzing sentiment, generating appropriate responses, integrating emotional cues, and refining the performance of the system 100 through feedback. The detailed description of each component within the system 100 follows, including the technical effects that contribute significantly to the overall efficacy and nature of the system 100.
FIG. 1 pictorially illustrates an architectural paradigm of the system 100, which can comprise functional elements including, but not limited to, a data reception module 102, a sentiment analysis module 104, a response generation module 106, an emotional cues integration module 108, and a feedback module 110. A person ordinarily skilled in the art would appreciate that those elements or components of the system 100 can be functionally or operationally coupled to or with each other, in accordance with the embodiments of the present disclosure.
In an embodiment, the data reception module 102 is tasked with receiving input text from a user. Said data reception module 102 serves as the initial point of contact between the user and the system 100, so that user input is accurately captured for further processing. The primary function of the data reception module 102 is to facilitate the seamless acquisition of input text, which is crucial for the subsequent analysis and response generation processes. By accurately capturing user input, the data reception module 102 lays the foundation for a responsive and user-centric AI architecture. The efficiency and reliability of said data reception module 102 directly affect the ability of the system 100 to provide timely and relevant AI-generated responses.
In an embodiment, the sentiment analysis module 104 is configured to perform one or more key functions such as, not restricted to determining the emotional tone of the input text and categorizing the determined emotional tone as positive, negative, or neutral. Said sentiment analysis module 104 employs advanced algorithms to analyze the input text for expressions of emotions such as happiness, sadness, or anger. By accurately identifying and categorizing the emotional tone of the input text, the sentiment analysis module 104 enables the system 100 to understand the emotional state of the user. Said understanding is important for generating AI responses that are not only relevant but also emotionally aligned with the expressed sentiment. The ability of the sentiment analysis module 104 to discern and categorize emotional tones contributes significantly to the overall empathetic responsiveness, enhancing the user experience by maintaining that responses are both contextually and emotionally appropriate.
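The determine-then-categorize behavior of the sentiment analysis module 104 can be sketched as follows. This is a minimal, lexicon-based illustration only; the lexicon, the word lists, and the simple set-intersection rule are assumptions for demonstration, not the machine-learning engine the disclosure describes.

```python
# Illustrative sketch of the sentiment analysis module (104): a tiny
# lexicon-based classifier. The lexicon and mapping are hypothetical
# stand-ins for the disclosed machine-learning engine.

EMOTION_LEXICON = {
    "happiness": {"happy", "glad", "delighted", "great", "wonderful"},
    "sadness": {"sad", "unhappy", "miserable", "disappointed", "down"},
    "anger": {"angry", "furious", "annoyed", "outraged", "mad"},
}

# Map each detected emotion onto the positive/negative/neutral categories.
POLARITY = {"happiness": "positive", "sadness": "negative", "anger": "negative"}


def analyze_sentiment(text: str) -> tuple[str, str]:
    """Return (emotion, category) for the input text."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    for emotion, markers in EMOTION_LEXICON.items():
        if words & markers:
            return emotion, POLARITY[emotion]
    return "none", "neutral"
```

In practice the module would emit a confidence-weighted distribution over emotions rather than a single label, but the two-stage output (specific emotion, then coarse polarity) mirrors the determination and categorization functions described above.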
In an embodiment, the response generation module 106 is responsible for producing an AI-generated response that reflects the determined emotional tone of the input text and adjusting the response based on the determined emotional tone. Said response generation module 106 utilizes the insights gained from the sentiment analysis module 104 to craft responses that are not only contextually relevant but also emotionally resonant. By adjusting the AI-generated response to mirror the emotional tone of the input text, the response generation module 106 facilitates that the responses are perceived as empathetic and understanding. The technical effect of said response generation module 106 lies in the ability to generate responses that foster a sense of connection and understanding between the system 100 and the user, thereby enhancing the quality of interaction.
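One way the tone-based adjustment of the response generation module 106 could operate is by wrapping a base informational reply in tone-matched framing. The templates below are illustrative assumptions; the disclosure does not fix a particular generation model.

```python
# Hypothetical sketch of the response generation module (106): a base
# reply is adjusted with framing that mirrors the categorized tone.

TONE_TEMPLATES = {
    "positive": "That's wonderful to hear! {reply}",
    "negative": "I'm sorry you're going through this. {reply}",
    "neutral": "{reply}",
}


def generate_response(base_reply: str, category: str) -> str:
    """Adjust a base reply to mirror the categorized emotional tone."""
    template = TONE_TEMPLATES.get(category, TONE_TEMPLATES["neutral"])
    return template.format(reply=base_reply)
```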
In an embodiment, the emotional cues integration module 108 is configured to incorporate emoticons or specific wording into the AI-generated response to convey empathy and understanding. Said emotional cues integration module 108 enhances the emotional expressiveness of AI-generated responses by integrating emotive elements that resonate with the user on an emotional level. The inclusion of emoticons and emotionally charged wording enriches the communicative value of responses, making said responses more relatable and human-like. The emotional cues integration module 108 plays a crucial role in elevating the user experience by maintaining that responses not only address the content of user input but also acknowledge and reflect the emotional state.
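The enrichment performed by the emotional cues integration module 108 can be sketched as a lookup that pairs each detected emotion with an emoticon and an empathetic phrase. The cue table is an illustrative assumption, not the module's actual database.

```python
# Sketch of the emotional cues integration module (108): append an
# emoticon and empathetic wording keyed by the detected emotion.

EMOTION_CUES = {
    "happiness": (":-)", "Glad to hear it!"),
    "sadness": (":-(", "I'm here for you."),
    "anger": (":-/", "I understand your frustration."),
}


def add_emotional_cues(response: str, emotion: str) -> str:
    """Enrich a response with an emoticon and an empathetic phrase."""
    if emotion not in EMOTION_CUES:
        return response  # no cue known for this emotion; leave unchanged
    emoticon, phrase = EMOTION_CUES[emotion]
    return f"{phrase} {response} {emoticon}"
```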
In an embodiment, the feedback module 110 is arranged to receive feedback from a user to maintain the contextual relevance and emotional resonance of the AI-generated response. Said feedback module 110 enables the continuous improvement of the system 100 by leveraging user feedback to refine response generation and emotional cue integration processes. By analyzing user feedback, the system 100 can identify areas for enhancement and adapt the algorithms to better meet user expectations. The feedback module 110 can maintain that the system 100 remains attuned to user needs and preferences, thereby sustaining the relevance and effectiveness of AI-generated responses over time.
Referring to one or more preceding embodiments, the system 100 for sentiment-driven emotional context adaptation and empathetic response generation in AI-generated text represents an approach to enhancing AI communication capabilities. Through the synergistic operation of the constituent modules, the system 100 offers a highly responsive, emotionally attuned communication experience that reflects a deep understanding of user sentiment and emotional states.
In an embodiment, the sentiment analysis module 104 further comprises a natural language processing engine that utilizes machine learning algorithms for emotional tone detection. The inclusion of a natural language processing engine enhances the capability of the sentiment analysis module 104 to interpret and analyze the input text with a high degree of accuracy and nuance. Machine learning algorithms enable the sentiment analysis module 104 to learn from a vast array of text samples, improving the ability to detect a wide range of emotional tones. Said adaptive learning capability can facilitate that sentiment analysis module 104 remains effective across various linguistic contexts and evolves in response to new patterns of emotional expression. The technical effect of incorporating a natural language processing engine within the sentiment analysis module 104 lies in the enhanced ability to discern emotional tones with greater precision, thereby facilitating the generation of responses that are more closely aligned with the emotional state and expectations of the user.
In another embodiment, the response generation module 106 further comprises a dialogue management unit that tailors the structure of the AI-generated response to emulate conversational patterns observed in human communication. The dialogue management unit plays a key role in facilitating that AI-generated responses convey the intended emotional tone and also adhere to conversational norms and expectations. By emulating the structure and flow of human communication, the dialogue management unit enhances the naturalness and relatability of AI-generated responses. Said dialogue management unit contributes significantly to the ability of the system 100 to engage the user in meaningful and empathetic exchanges, fostering a sense of connection and understanding. The technical effect of said dialogue management unit is evident in the contribution to creating a more human-like, engaging, and contextually appropriate conversational experience for the user.
In a further embodiment, the emotional cues integration module 108 is configured to select emoticons or specific wording based on a cultural and linguistic database that accounts for regional and demographic variations in emotional expression. Said configuration allows the cues integration module 108 to tailor the selection of emoticons and wording to the cultural and linguistic nuances of the user, enhancing the relevance and effectiveness of the emotional cues integrated into AI-generated responses.
By acknowledging and adapting to regional and demographic variations in emotional expression, the emotional cues integration module 108 significantly improves the ability of the system 100 to convey empathy and understanding in a manner that resonates with the cultural and linguistic background of the user. The technical effect of said feature is the enhancement of the inclusivity and sensitivity of the system 100 to diverse emotional expressions, thereby broadening the applicability and appeal across different user groups.
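The cultural and linguistic database described above could be realized, in simplified form, as a table keyed by (emotion, locale). The locales and cue choices below are illustrative assumptions about regional emoticon conventions.

```python
# Hypothetical sketch of locale-aware cue selection: the same emotion
# maps to different emoticons depending on the user's locale.

CULTURAL_CUES = {
    ("happiness", "en-US"): ":-)",
    ("happiness", "ja-JP"): "(^_^)",   # kaomoji-style convention
    ("sadness", "en-US"): ":-(",
    ("sadness", "ja-JP"): "(;_;)",
}


def select_cue(emotion: str, locale: str, default: str = "") -> str:
    """Pick an emoticon appropriate to the user's locale, if one is known."""
    return CULTURAL_CUES.get((emotion, locale), default)
```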
In another embodiment, the feedback module 110 further comprises an adaptive learning component that updates response strategies based on user feedback patterns. Said component enables the system 100 to dynamically refine and adjust the response strategies to better align with user preferences and expectations. By analyzing patterns in user feedback, the adaptive learning component facilitates continuous improvement in the performance, so that AI-generated responses remain contextually relevant and emotionally resonant over time. The technical effect of the adaptive learning component is the contribution to the capacity of the system 100 for self-optimization, which plays a pivotal role in sustaining user engagement and satisfaction.
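The adaptive learning component of the feedback module 110 can be sketched as a running score per response strategy, updated from thumbs-up/thumbs-down feedback. The strategy names and the simple additive scoring rule are assumptions for illustration; the disclosure leaves the learning mechanism open.

```python
# Sketch of the feedback module's adaptive learning component (110):
# cumulative feedback scores select the preferred response strategy.

from collections import defaultdict


class AdaptiveFeedback:
    def __init__(self) -> None:
        self.scores: dict[str, int] = defaultdict(int)

    def record(self, strategy: str, positive: bool) -> None:
        """Log one piece of user feedback for a response strategy."""
        self.scores[strategy] += 1 if positive else -1

    def best_strategy(self, default: str = "neutral") -> str:
        """Return the strategy with the highest cumulative score."""
        if not self.scores:
            return default
        return max(self.scores, key=self.scores.get)
```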
In a further embodiment, the sentiment analysis module 104 is configured to utilize deep learning networks to discern underlying emotional subtleties in the input text that are not explicit. The application of deep learning networks allows the sentiment analysis module 104 to capture and interpret subtle cues and nuances in the input text, enabling a more nuanced understanding of the emotional state of the user. Said capability is particularly valuable in cases where the emotional tone of the input text is complex or ambivalent. The technical effect of employing deep learning networks within the sentiment analysis module 104 is the enhancement of the sensitivity of the system 100 to emotional subtleties, thereby improving the accuracy and depth of emotional tone detection and response generation.
In another embodiment, the response generation module 106 is configured to apply syntactic and semantic rules of the language of the input text to maintain grammatical coherence in the AI-generated response. Said configuration facilitates that the AI-generated responses are not only emotionally aligned with the user input but also grammatically coherent and linguistically appropriate. By adhering to the syntactic and semantic rules of the language, the response generation module 106 enhances the clarity, readability, and professionalism of AI-generated responses. The technical effect of said feature lies in the contribution to the overall quality and effectiveness of communication between the system 100 and the user, reinforcing the reliability and credibility of the system 100.
In a further embodiment, the feedback module 110 includes a sentiment drift detection component that identifies shifts in user sentiment over the course of interactions to adjust the emotional tone. Said component allows the system 100 to detect and respond to changes in the emotional state of the user, maintaining that AI-generated responses remain aligned with the evolving sentiment of the user. The ability to detect and adapt to sentiment drift enhances the responsiveness and sensitivity to the emotional dynamics of the user, contributing to a more personalized and empathetic user experience. The technical effect of the sentiment drift detection component is the role in maintaining the contextual and emotional relevance of AI-generated responses, thereby enhancing user engagement and satisfaction.
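One plausible realization of the sentiment drift detection component is a rolling window of per-turn polarity scores that raises a flag when the recent average turns sharply negative. The window size, threshold, and score mapping below are illustrative assumptions.

```python
# Sketch of sentiment drift detection: average the last few turns'
# polarity scores and flag a drift when the average crosses a threshold.

from collections import deque

POLARITY_SCORE = {"positive": 1.0, "neutral": 0.0, "negative": -1.0}


class DriftDetector:
    def __init__(self, window: int = 3, threshold: float = 0.5) -> None:
        self.history: deque = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, category: str) -> bool:
        """Record one turn's tone; True means sentiment has drifted negative."""
        self.history.append(POLARITY_SCORE[category])
        avg = sum(self.history) / len(self.history)
        return avg <= -self.threshold
```

A symmetric check on `avg >= self.threshold` would detect positive drift, letting the system brighten its tone as the conversation improves.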
In another embodiment, the sentiment analysis module 104 is configured to determine the intensity of the emotional tone and to adjust the intensity of the emotional tone or expression in the AI-generated response accordingly. Said configuration enables the sentiment analysis module 104 to identify the presence of an emotional tone and also to gauge its intensity, allowing for a more precise and nuanced adaptation of the AI-generated response. By adjusting the intensity of the emotional tone or expression in the response to match that of the input text, the sentiment analysis module 104 facilitates that the responses are appropriately calibrated to the emotional state of the user. The technical effect of said feature is the enhancement of the ability to engage in emotionally congruent and contextually sensitive communication, thereby fostering a deeper sense of empathy and connection with the user.
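Intensity determination can be illustrated with a simple surface-level heuristic that counts intensifier words and exclamation marks. The marker list and the three-level scale are hypothetical; the disclosure does not fix a particular intensity metric.

```python
# Sketch of emotional intensity scoring: intensifiers and exclamation
# marks raise the graded intensity of the detected emotion.

INTENSIFIERS = {"very", "really", "extremely", "so", "absolutely"}


def emotion_intensity(text: str) -> str:
    """Return 'low', 'medium', or 'high' intensity for the input text."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    score = sum(w in INTENSIFIERS for w in words) + text.count("!")
    if score >= 3:
        return "high"
    if score >= 1:
        return "medium"
    return "low"
```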
Disclosed herein is a method 200 for sentiment-driven emotional context adaptation in artificial intelligence (AI) generated text. The method 200 consists of one or more steps performed to interpret user input, analyze the emotional content, generate appropriate responses, and refine the process based on user feedback. The method 200 facilitates that AI-generated text is not only contextually relevant but also emotionally resonant with the user, enhancing the interaction experience.
Referring to a diagrammatic depiction put forth in FIG. 2, representing a flow diagram of the method 200 that can comprise steps of, yet not restricted to, (at step 202) receiving input text from a user, (at step 204) determining the emotional tone of said input text, (at step 206) categorizing the determined emotional tone, (at step 208) producing an AI-generated response that reflects the determined emotional tone of said input text, (at step 210) adjusting said AI-generated response, (at step 212) incorporating emoticons or specific wording into said AI-generated response and (at step 214) receiving feedback from a user to maintain the contextual relevance and emotional resonance. Said steps of the method 200 can be performed or executed, collectively or selectively, randomly, or sequentially or in a combination thereof, in accordance with the embodiments of current disclosure.
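The sequence of steps 202 through 212 can be condensed into a single end-to-end sketch. Each helper below is a deliberately simplified stand-in for the corresponding module; the word lists, the placeholder base reply, and the cue phrasing are illustrative assumptions, not the claimed implementation.

```python
# End-to-end sketch of method 200: receive text, categorize its tone,
# and produce a tone-adjusted, cue-enriched response.

NEGATIVE = {"sad", "angry", "furious", "disappointed"}
POSITIVE = {"happy", "glad", "delighted"}


def categorize(text: str) -> str:                      # steps 204-206
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & NEGATIVE:
        return "negative"
    if words & POSITIVE:
        return "positive"
    return "neutral"


def respond(text: str) -> str:                         # steps 202, 208-212
    category = categorize(text)                        # analyze received text
    base = "Thanks for sharing that."                  # placeholder reply
    if category == "negative":
        return "I'm sorry to hear that. " + base + " :-("
    if category == "positive":
        return "That's great! " + base + " :-)"
    return base
```

Step 214 (feedback) is omitted here; in a fuller sketch, user ratings of `respond` outputs would feed back into the categorization lexicon or the response templates.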
In an embodiment, at step 202, input text is received from a user in a data reception module 102. Said initial step 202 is important in capturing the user input, serving as the foundation for all subsequent analysis and response generation. The accuracy and efficiency of the data reception module 102 in capturing user input are important in directly impacting the ability to provide relevant and timely AI-generated responses. The technical effect of said step 202 lies in the capacity to accurately capture and transmit user input to the sentiment analysis module 104, so that the analysis is based on precise and complete user input data.
In an embodiment, at step 204, the emotional tone of the input text is determined by a sentiment analysis module 104. Said step 204 involves analyzing the input text to identify emotions such as happiness, sadness, or anger. By determining the emotional tone, the sentiment analysis module 104 enables the system 100 to gain insights into the emotional state, which is important for generating responses that are not only relevant but also empathetically aligned with the expressed sentiment of the user. The technical effect of said step 204 is the contribution to enhancing the ability of said system 100 to interpret and respond to the emotional cues accurately.
In an embodiment, at step 206, the determined emotional tone is categorized by the sentiment analysis module 104 as positive, negative, or neutral. Said categorization process is essential for guiding the response generation module 106 in crafting responses that are appropriately aligned with the emotional context of the user input. By categorizing the emotional tone, the sentiment analysis module 104 facilitates a more nuanced response strategy that considers the emotional valence of the user input. The technical effect of said categorization lies in the role in enabling more targeted and emotionally congruent AI-generated responses.
In an embodiment, at step 208, an AI-generated response that reflects the determined emotional tone of the input text is produced by the response generation module 106. Said step 208 is critical in translating the insights gained from the sentiment analysis into actual text that the user can perceive as understanding and empathetic. The ability of the response generation module 106 to produce responses that accurately reflect the emotional tone of the user enhances the perceived empathy and relevance of the interactions of said system 100. The technical effect of said step 208 is the improvement of user engagement through the provision of responses that are both contextually appropriate and emotionally resonant.
In an embodiment, at step 210, the AI-generated response is adjusted by the response generation module 106 based on the determined emotional tone. Said adjustment process maintains that the response is reflective of the emotional state of the user and also appropriately modulated to convey empathy and understanding. By fine-tuning the AI-generated response according to the emotional tone, the system 100 can better align the responses with the expectations and emotional needs of the user. The technical effect of said step 210 is the enhancement of the ability of said system 100 to engage in a more personalized and empathetic dialogue with the user.
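Steps 208 and 210 can be sketched as a two-stage pipeline: produce a base response from the tone category, then fine-tune it for the specific emotion. The templates below are hypothetical; the response generation module 106 could equally drive a generative language model:

```python
# Illustrative sketch of steps 208-210: producing a response (from the
# coarse category) and adjusting it (for the specific emotion). The
# template strings are hypothetical examples.

BASE_RESPONSES = {
    "positive": "That sounds great!",
    "negative": "I'm sorry to hear that.",
    "neutral": "Thanks for sharing.",
}

TONE_ADJUSTMENT = {
    "happiness": " It's wonderful that things are going well for you.",
    "sadness": " Please know that your feelings are completely valid.",
    "anger": " Your frustration is understandable.",
}

def generate_response(category: str) -> str:
    """Step 208: produce a base response reflecting the tone category."""
    return BASE_RESPONSES.get(category, BASE_RESPONSES["neutral"])

def adjust_response(response: str, emotion: str) -> str:
    """Step 210: fine-tune the response for the specific emotion."""
    return response + TONE_ADJUSTMENT.get(emotion, "")
```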
In an embodiment, at step 212, emoticons or specific wording are incorporated into the AI-generated response by an emotional cues integration module 108 to convey empathy and understanding. Said incorporation of emotional cues is a key step in enriching the AI-generated text with elements that can enhance the emotional expressiveness and resonance. The selection and integration of appropriate emoticons and wording contribute to making the AI-generated responses more relatable and human-like. The technical effect of said step 212 is the significant contribution to improving the emotional depth and expressiveness of AI-generated responses, thereby enhancing the overall user experience.
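The cue-enrichment of step 212 can be sketched as a simple lookup-and-compose operation. The emoticon and phrase tables are hypothetical examples of what the emotional cues integration module 108 might hold (claim 4 contemplates selecting them from a cultural and linguistic database):

```python
# Illustrative sketch of step 212: enriching a response with emoticons
# and empathetic wording. The cue tables are hypothetical examples.

EMOTICONS = {"happiness": ":)", "sadness": ":(", "anger": ":|"}
EMPATHY_PHRASES = {
    "sadness": "I understand how hard that must be. ",
    "anger": "I hear you. ",
}

def integrate_emotional_cues(response: str, emotion: str) -> str:
    """Prepend an empathetic phrase and append a matching emoticon."""
    phrase = EMPATHY_PHRASES.get(emotion, "")
    emoticon = EMOTICONS.get(emotion)
    enriched = phrase + response
    if emoticon:
        enriched += " " + emoticon
    return enriched
```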
In an embodiment, at step 214, feedback is received from a user by a feedback module 110 to maintain the contextual relevance and emotional resonance of the AI-generated response. Said final step 214 in the method 200 enables the system 100 to refine and adapt the response strategies based on direct user feedback, so that the AI-generated responses continue to meet user expectations and preferences over time. By incorporating user feedback into the response generation process, the system 100 can continually improve the accuracy, relevance, and emotional alignment. The technical effect of said step 214 is the facilitation of a dynamic, user-driven refinement process that enhances the effectiveness and user satisfaction of said system 100.
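One minimal way to realize the feedback loop of step 214 is an exponentially weighted update of a per-emotion cue weight, so that cue styles the user rates poorly are gradually used less. The update rule and class below are a hypothetical stand-in for the adaptive learning component the feedback module 110 would implement:

```python
# Illustrative sketch of step 214: user ratings nudge a per-emotion
# weight toward the observed rating, so poorly received cue styles are
# used less often. The update rule is a hypothetical example.

class FeedbackModule:
    def __init__(self, learning_rate: float = 0.1):
        self.learning_rate = learning_rate
        # Weight per emotion: how strongly to apply emotional cues.
        self.cue_weights = {}

    def record_feedback(self, emotion: str, rating: float) -> None:
        """rating in [0, 1]; 1 means the user found the response empathetic."""
        current = self.cue_weights.get(emotion, 0.5)
        # Move the weight a fraction of the way toward the rating.
        self.cue_weights[emotion] = current + self.learning_rate * (rating - current)

    def should_use_cues(self, emotion: str) -> bool:
        return self.cue_weights.get(emotion, 0.5) >= 0.5
```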
Referring to one or more preceding embodiments, thus disclosed herein is a significant aspect in natural language processing (NLP) and artificial intelligence (AI), aimed at enriching AI-generated text with emotional context and empathy. The system 100 addresses the common shortfall in conventional AI systems, where the generated text often lacks the emotional depth necessary for effective communication and user interaction. By integrating the sentiment analysis and empathetic response generation techniques, the system 100 significantly enhances the quality of AI-generated text, making it more relatable and engaging for the user.
Referring to one or more preceding embodiments, the multi-step method 200 begins with the analysis of the emotional tone of input text using advanced NLP techniques. Said initial step categorizes the sentiment as positive, negative, or neutral, setting the stage for the adaptive generation of responses that mirror the emotional context of the input. The method 200 further refines AI-generated responses by incorporating specific words, phrases, and tones that convey empathy and understanding, significantly improving the emotional depth of the communication.
Referring to one or more preceding embodiments, the method 200 maintains that AI-generated text is contextually relevant and also emotionally resonant with the user, fostering a deeper connection between the system 100 and the user. The system 100 includes mechanisms for continuously refining the response strategies based on user feedback, thereby enhancing the ability to provide personalized and emotionally aligned responses over time.
Thus, the system 100 marks a considerable leap forward in making AI-generated text more emotionally intelligent, leading to more meaningful and effective interactions between artificial intelligence (AI) and the user. By focusing on the emotional tone of the input text and generating empathetic responses, the method 200 offers a tailored communication experience that significantly reduces miscommunication and increases user engagement.
FIG. 3 illustrates a working decision flow diagram (DFD) of the method 200 for enhancing AI-generated text with emotional context and empathy. The method 200 begins by receiving input text along with user preferences. Next, the text undergoes sentiment analysis to understand the emotional tone. Depending on whether the sentiment is identified as positive or negative, the tone of the response is adjusted accordingly. A response is then generated, and if an emotion is detected within the text, the response is further refined to align with that emotion. Emotional cues are added to the response to express empathy. The system 100 then generates an empathetic response and collects user feedback to improve future interactions. Said feedback is used to update the integrated modules, leading to the generation of an output that is both contextually and emotionally relevant, before the method 200 concludes.
FIG. 4 illustrates a flowchart of the method 200 for processing and generating text with emotional intelligence. The process begins with input processing, followed by sentiment analysis using the VADER (Valence Aware Dictionary and Sentiment Reasoner) tool. After determining sentiment, the system 100 adjusts the contextual tone and detects emotions within the text. The system 100 then expresses these emotions, potentially by generating emotional cues. An empathetic response is created based on this emotional understanding. The system 100 utilizes a feedback loop to refine the performance and integrate improvements. Finally, the refined output is generated by the system 100, concluding the process.
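VADER reports a normalized compound score in [-1, 1], and its documentation suggests conventional thresholds for mapping that score to the positive/negative/neutral categories used at step 206. The sketch below applies only those documented thresholds; in practice the compound value would come from `SentimentIntensityAnalyzer().polarity_scores(text)["compound"]` of the `vaderSentiment` package:

```python
# Sketch of the categorization stage after VADER scoring. Only the
# conventional thresholds are shown; the compound score itself would be
# produced by vaderSentiment's SentimentIntensityAnalyzer in practice.

def classify_compound(compound: float) -> str:
    """Apply VADER's conventional thresholds to a compound score in [-1, 1]."""
    if compound >= 0.05:
        return "positive"
    if compound <= -0.05:
        return "negative"
    return "neutral"
```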
Example embodiments herein have been described above with reference to block diagrams and flowchart illustrations of methods and apparatuses. It will be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by various means including hardware, software, firmware, and a combination thereof. For example, in one embodiment, each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations can be implemented by computer program instructions. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.
Throughout the present disclosure, the term ‘processing means’ or ‘microprocessor’ or ‘processor’ or ‘processors’ includes, but is not limited to, a general purpose processor (such as, for example, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a microprocessor implementing other types of instruction sets, or a microprocessor implementing a combination of types of instruction sets) or a specialized processor (such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), or a network processor).
The term “non-transitory storage device” or “storage” or “memory,” as used herein, relates to random access memory, read only memory, and variants thereof, in which a computer can store data or software for any duration.
Operations in accordance with a variety of aspects of the disclosure described above need not be performed in the precise order described. Rather, various steps can be handled in reverse order, simultaneously, or not at all.
While several implementations have been described and illustrated herein, a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein may be utilized, and each of such variations and/or modifications is deemed to be within the scope of the implementations described herein. More generally, all parameters, dimensions, materials, and configurations described herein are meant to be exemplary, and the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific implementations described herein. It is, therefore, to be understood that the foregoing implementations are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, implementations may be practiced otherwise than as specifically described and claimed. Implementations of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.

Claims

I/We claim:

A system 100 for sentiment-driven emotional context adaptation and empathetic response generation in artificial intelligence (AI) generated text, the system 100 comprising:
a data reception module 102 configured to receive input text from a user;
a sentiment analysis module 104 configured to:
determine the emotional tone of said input text, wherein said emotional tone includes happiness, sadness, or anger; and
categorize the determined emotional tone as positive, negative, or neutral;
a response generation module 106 configured to:
produce an AI-generated response that reflects the determined emotional tone of said input text; and
adjust said AI-generated response based on the determined emotional tone;
an emotional cues integration module 108 configured to incorporate emoticons or specific wording into said AI-generated response to convey empathy and understanding; and
a feedback module 110 configured to receive feedback from a user to maintain the contextual relevance and emotional resonance of said AI-generated response.
The system 100 of claim 1, wherein the sentiment analysis module 104 further comprises a natural language processing engine that utilizes machine learning algorithms for said emotional tone detection.
The system 100 of claim 1, wherein the response generation module 106 further comprises a dialogue management unit that tailors the structure of said AI-generated response to emulate conversational patterns observed in human communication.
The system 100 of claim 1, wherein the emotional cues integration module 108 is further configured to select emoticons or specific wording based on a cultural and linguistic database that accounts for regional and demographic variations in emotional expression.
The system 100 of claim 1, wherein the feedback module 110 further comprises an adaptive learning component that updates response strategies based on user feedback patterns.
The system 100 of claim 1, wherein the sentiment analysis module 104 is further configured to utilize deep learning networks to discern underlying emotional subtleties in said input text that are not explicit.
The system 100 of claim 1, wherein the response generation module 106 is further configured to apply syntactic and semantic rules of the language of said input text to maintain grammatical coherence in said AI-generated response.
The system 100 of claim 1, wherein the feedback module 110 further includes a sentiment drift detection component that identifies shifts in user sentiment over the course of interactions to adjust said emotional tone.
The system 100 of claim 1, wherein the sentiment analysis module 104 is further configured to:
determine the intensity of the emotional tone; and
adjust the intensity of the emotional tone or expression in said AI-generated response accordingly.
A method 200 for sentiment-driven emotional context adaptation in artificial intelligence (AI) generated text, comprising the steps of:
(at step 202) receiving input text from a user in a data reception module 102;
(at step 204) determining, by a sentiment analysis module 104, the emotional tone of said input text, including emotions such as happiness, sadness, or anger;
(at step 206) categorizing, by said sentiment analysis module 104, the determined emotional tone as positive, negative, or neutral;
(at step 208) producing, by a response generation module 106, an AI-generated response that reflects the determined emotional tone of said input text;
(at step 210) adjusting, by said response generation module 106, said AI-generated response based on the determined emotional tone;
(at step 212) incorporating, by an emotional cues integration module 108, emoticons or specific wording into said AI-generated response to convey empathy and understanding; and
(at step 214) receiving, by a feedback module 110, feedback from a user to maintain the contextual relevance and emotional resonance of said AI-generated response.

SENTIMENT DRIVEN EMOTIONAL CONTEXT ADAPTION AND EMPATHETIC RESPONSE GENERATION IN AI GENERATED TEXT

The present disclosure provides a system (100) for enhancing artificial intelligence (AI)-generated text with sentiment-driven emotional context adaptation and empathetic response generation. The system features a data reception module (102) for acquiring input text from users, and a sentiment analysis module (104) that determines and categorizes the emotional tone of the text, including complex emotions like happiness, sadness, or anger. A response generation module (106) then creates and adjusts AI-generated responses to mirror the identified emotional tones. An emotional cues integration module (108) enriches responses with emoticons or empathetic wording, while a feedback module (110) refines the performance of the system, maintaining contextually relevant and emotionally resonant communication.

Drawings
FIG. 1
FIG. 2
FIG. 3
FIG. 4


Documents

Application Documents

# Name Date
1 202421033119-OTHERS [26-04-2024(online)].pdf 2024-04-26
2 202421033119-FORM FOR SMALL ENTITY(FORM-28) [26-04-2024(online)].pdf 2024-04-26
3 202421033119-FORM 1 [26-04-2024(online)].pdf 2024-04-26
4 202421033119-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [26-04-2024(online)].pdf 2024-04-26
5 202421033119-EDUCATIONAL INSTITUTION(S) [26-04-2024(online)].pdf 2024-04-26
6 202421033119-DRAWINGS [26-04-2024(online)].pdf 2024-04-26
7 202421033119-DECLARATION OF INVENTORSHIP (FORM 5) [26-04-2024(online)].pdf 2024-04-26
8 202421033119-COMPLETE SPECIFICATION [26-04-2024(online)].pdf 2024-04-26
9 202421033119-FORM-9 [07-05-2024(online)].pdf 2024-05-07
10 202421033119-FORM 18 [08-05-2024(online)].pdf 2024-05-08
11 202421033119-FORM-26 [12-05-2024(online)].pdf 2024-05-12
12 202421033119-FORM 3 [13-06-2024(online)].pdf 2024-06-13
13 202421033119-RELEVANT DOCUMENTS [09-10-2024(online)].pdf 2024-10-09
14 202421033119-POA [09-10-2024(online)].pdf 2024-10-09
15 202421033119-FORM 13 [09-10-2024(online)].pdf 2024-10-09
16 202421033119-FER.pdf 2025-07-28
17 202421033119-FORM-8 [03-09-2025(online)].pdf 2025-09-03
18 202421033119-FER_SER_REPLY [03-09-2025(online)].pdf 2025-09-03
19 202421033119-DRAWING [03-09-2025(online)].pdf 2025-09-03
20 202421033119-CORRESPONDENCE [03-09-2025(online)].pdf 2025-09-03

Search Strategy

1 202421033119_SearchStrategyNew_E_SearchStrategyE_18-03-2025.pdf