CNN-Based Self-Attention Mechanism for Cross-Modal Recipe Generation for the Food Industry
DOI: https://doi.org/10.2583/

Keywords: Blockchain, CNN, Attention layer, Cross-Modal, Recipe Generation, Food Industry

Abstract
Diet management requires keeping track of what one eats. The researchers present a recipe retrieval technique based on food photos: given a captured image, it retrieves the related recipes and generates the corresponding nutritional information, making dietary recording more convenient. Recipe retrieval is an instance of a cross-modal retrieval problem; compared with other such problems, its main challenge is that a recipe describes a succession of transformations from raw ingredients to a finished dish rather than immediately visible characteristics. The model must therefore develop a thorough understanding of how the raw ingredients are processed. Existing recipe retrieval research, however, processes recipe text linearly, which makes it difficult to capture long-range dependencies within a recipe. To overcome this difficulty, a cross-modal recipe retrieval model based on the self-attention mechanism is proposed. The model uses the Transformer's self-attention mechanism to capture long-distance interactions in recipes, and it improves on the attention mechanisms of prior techniques to mine recipe semantics more effectively. According to the experimental results, the approach improves the recall rate of the recipe retrieval task by 22% over the baseline strategy.
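To make the described approach concrete, the sketch below illustrates (under stated assumptions, not the authors' implementation) how a self-attention encoder for recipe text can be paired with a projection of CNN image features into a shared space for cross-modal retrieval. The vocabulary size, embedding width, layer counts, and the 2048-dimensional image feature input are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): a self-attention recipe-text encoder
# and a projection of precomputed CNN image features into a shared retrieval
# space, scored by cosine similarity. All hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RecipeTextEncoder(nn.Module):
    """Encodes a tokenized recipe (ingredients + instructions) with self-attention."""

    def __init__(self, vocab_size=20000, d_model=256, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, token_ids, pad_mask):
        x = self.embed(token_ids)                        # (B, T, d_model)
        x = self.encoder(x, src_key_padding_mask=pad_mask)
        x = x.masked_fill(pad_mask.unsqueeze(-1), 0.0)   # ignore padded positions
        pooled = x.sum(dim=1) / (~pad_mask).sum(dim=1, keepdim=True)
        return F.normalize(self.proj(pooled), dim=-1)    # unit-norm recipe embedding


class ImageProjector(nn.Module):
    """Projects CNN image features (assumed precomputed) into the shared space."""

    def __init__(self, feat_dim=2048, d_model=256):
        super().__init__()
        self.proj = nn.Linear(feat_dim, d_model)

    def forward(self, img_feats):
        return F.normalize(self.proj(img_feats), dim=-1)


if __name__ == "__main__":
    # Toy retrieval: rank recipe embeddings by cosine similarity to an image query.
    text_enc, img_enc = RecipeTextEncoder(), ImageProjector()
    tokens = torch.randint(1, 20000, (8, 50))            # 8 recipes, 50 tokens each
    pad_mask = torch.zeros(8, 50, dtype=torch.bool)      # no padding in this toy batch
    recipes = text_enc(tokens, pad_mask)                 # (8, 256)
    query = img_enc(torch.randn(1, 2048))                # (1, 256)
    scores = query @ recipes.T                           # cosine similarities
    print("best-matching recipe index:", scores.argmax(dim=1).item())
```

Because every token attends to every other token, long-range dependencies between early ingredient mentions and later cooking steps are modeled directly, rather than through the sequential bottleneck of a purely linear text encoder.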
License
Copyright (c) 2023 Ismail Keshta, Mukesh Soni
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
JESM Journal operates under the Attribution-NonCommercial 4.0 International license (CC BY-NC 4.0). This allows others to distribute, remix, tweak, and build upon the work non-commercially, as long as they credit the authors for the original creation. All authors publishing in JESM Journal accept these as the terms of publication.