Post by account_disabled on Dec 9, 2023 0:04:34 GMT -5
Emojify works on the same principle we want to use to analyze customer emotions: it tries to recognize the facial expression you are making right now and classify it as, say, Happiness or Sadness. From there, if we want to fine-tune the deep learning model to be more specific, or even reduce the number of emotion segments, that can be done too, but you have to train the model yourself, haha. Personally, Nick recommends OpenCV; it's not too difficult to understand.
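"Reducing the number of emotion segments" can be sketched without retraining at all: collapse the model's per-class probabilities into coarser groups by summing them. This is a minimal illustration only; the seven labels below are the classic FER-2013 set, and the positive/negative/neutral grouping is my own assumption, not something the Emojify demo defines.

```python
# Illustrative sketch: collapse a 7-class emotion prediction into
# fewer marketing-oriented segments by summing class probabilities.
# Labels and grouping are assumptions, not taken from Emojify itself.

SEGMENTS = {
    "Positive": ["Happy", "Surprise"],
    "Negative": ["Angry", "Disgust", "Fear", "Sad"],
    "Neutral": ["Neutral"],
}

def reduce_segments(probs):
    """Map a dict of per-emotion probabilities onto coarser segments."""
    return {
        segment: round(sum(probs.get(label, 0.0) for label in labels), 4)
        for segment, labels in SEGMENTS.items()
    }

# Example model output (made-up numbers, just for illustration)
prediction = {"Angry": 0.02, "Disgust": 0.01, "Fear": 0.03,
              "Happy": 0.70, "Sad": 0.02, "Surprise": 0.05, "Neutral": 0.17}
print(reduce_segments(prediction))
# {'Positive': 0.75, 'Negative': 0.08, 'Neutral': 0.17}
```

A real fine-tuned model could instead be retrained with the merged label set, but summing probabilities is a quick way to test whether coarser segments are good enough before committing to retraining.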
If you want to try studying this yourselves, friends, you can follow along with this video; I guarantee it's not difficult ( •̀ ω •́ )✧. Coming back to our story with Emojify: when we enter the site, it asks us to turn on the camera, so give it Allow. After that, watch how it predicts your mood from the camera in real time, right away. In this section, Nick tried making faces at the camera himself (555+) to test its accuracy for you. These are the emotions customers are likely to express when paying for a product. # Happiness: Very happy, the price is cheap and the product is very good. [Image: Emotion AI in marketing, reaching customers' emotions with Image Processing]
What Nick wanted to know is: how much do the model's results differ between wearing glasses and not wearing glasses? It turned out that without glasses the model predicted Happy 0.70 and Neutral 0.26, while with glasses it predicted Happy 0.88 and Neutral 0.11. This lets us hypothesize further that facial elements such as glasses vs. no glasses, hat vs. no hat, or anything else may affect the results. (However, Nick would like to add a note that the facial expressions in the two pictures are not 100% identical ^^) # Disgust: Ha!! What? What's the price? This one is the face Nick makes when confused about the price.
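The glasses-versus-no-glasses observation above can be made concrete with simple arithmetic: compute the per-class shift between the two reported predictions. The probabilities are the ones quoted above; the helper function is just illustrative.

```python
# Compare the model's reported outputs with and without glasses
# (probabilities taken from the experiment described above).
no_glasses = {"Happy": 0.70, "Neutral": 0.26}
with_glasses = {"Happy": 0.88, "Neutral": 0.11}

def probability_shift(a, b):
    """Per-class absolute difference between two prediction dicts."""
    return {label: round(abs(a[label] - b[label]), 2) for label in a}

print(probability_shift(no_glasses, with_glasses))
# {'Happy': 0.18, 'Neutral': 0.15}
```

A shift of 0.18 in the Happy class from one accessory alone supports the hypothesis that occlusions like glasses meaningfully perturb the model, though as noted, the two expressions were not perfectly identical, so this is suggestive rather than conclusive.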