Sentiment Analysis of Brand
“Positive sentiment” doesn’t mean users love you. Might mean they’re laughing at you.
Your protocol’s social listening tool shows 85% positive sentiment. Great, right? You dig deeper. The “positive” mentions are people saying “Can’t wait for this rug to pull lmao” and “This is so bad it’s funny.”
The algorithm saw “can’t wait” (positive) and “funny” (positive). Scored it positive. Completely missed the sarcasm, mockery, and genuine negativity.
This happens constantly. Basic sentiment analysis in crypto is worse than useless. It’s misleading. Makes you think brand health is good when it’s actually terrible. Or shows negative when community is just being ironically supportive (“this protocol fucks” reads as negative to algorithms).
Crypto communication is sarcastic, ironic, meme-heavy, and context-dependent. Standard sentiment tools built for corporate PR don’t work. Need different approach entirely.
Going beyond positive/negative to understand actual brand health requires measuring emotional associations, community energy, crisis resilience, and doing it where users actually communicate (mostly mobile, mostly Twitter/Telegram/Discord).
Why basic sentiment fails in crypto
Standard sentiment analysis looks for keywords. “Love,” “great,” “amazing” = positive. “Hate,” “terrible,” “awful” = negative.
Crypto breaks this completely:
Irony and sarcasm: “This protocol is so bad I’m all in” reads as negative. Actually means user is bullish in ironic way.
Slang and neologisms: “This fucks,” “absolutely rekt,” “ngmi,” “wagmi.” Tools don’t understand these. Classify randomly.
Context dependence: “Dumping everything” is negative if talking about price. Positive if talking about competitor using your protocol.
Emoji meaning: “💀” can mean dying (negative), dying of laughter (positive), or skull emoji meta-irony (neutral observational).
Community inside jokes: References only community understands. Sentiment tools have no context. Classify wrong.
FUD as strategy: Intentional Fear, Uncertainty, Doubt campaigns. Coordinated negativity. Need to detect this separately from organic criticism.
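This failure mode is easy to reproduce. A minimal sketch of a keyword-lexicon scorer (the word and phrase lists are illustrative, not from any real tool):

```python
# Naive lexicon-based sentiment: count positive vs negative keyword hits.
POSITIVE = ("can't wait", "love", "great", "amazing", "funny")
NEGATIVE = ("hate", "terrible", "awful", "scam")

def naive_sentiment(text: str) -> str:
    t = text.lower()
    pos = sum(t.count(p) for p in POSITIVE)
    neg = sum(t.count(n) for n in NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

# Sarcastic mockery scores positive because the lexicon sees "can't wait":
print(naive_sentiment("Can't wait for this rug to pull lmao"))  # positive
# Slang praise scores neutral because "fucks" is in neither list:
print(naive_sentiment("this protocol fucks"))  # neutral
```

Real tools fail the same way, just with bigger lexicons. Nothing in keyword counting can see that the first tweet is mockery and the second is a compliment.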
Real example from Terra/Luna collapse:
Standard sentiment showed 60% negative two weeks before collapse. Missed the severity. “Negative” captured general market fear, not existential protocol risk.
Better analysis would have caught: Panic tone. Desperation in language. Community turning on founders. Emotional shift from confident to desperate.
Tone and emotion matter more than simple positive/negative.
Emotional brand associations
What feelings does your brand evoke? This matters more than sentiment score.
The core emotions:
Trust: “This protocol has never failed me.” Core for any financial protocol. Measurable through consistency of usage, returning users, funds left on protocol.
Excitement: “Can’t wait to try this new feature.” Drives growth. Measurable through early adoption, feature requests, speculation.
Fear: “What if this gets hacked?” Can be healthy (drives caution) or unhealthy (drives exodus). Measurable through withdrawal patterns, questions about security.
Anger: “These fees are ridiculous.” Dangerous emotion. Measurable through complaint volume, competitor comparisons, community tone shift.
Delight: “This UX is so smooth.” Drives word-of-mouth. Measurable through voluntary sharing, unsolicited praise, recommendation rate.
Boredom: “This protocol never does anything new.” Kills momentum. Measurable through declining engagement, conversation moving to competitors.
Superiority: “I use X, plebs use Y.” Status-driven usage. Measurable through flex posting, PFP usage, community gatekeeping.
How to measure:
Survey direct questions: “When you think of [Protocol], what emotion do you feel first?” Multiple choice with emotions listed.
Analyze language patterns: Look for emotion words in mentions. “Scared,” “excited,” “frustrated,” “impressed.” Frequency indicates dominant emotions.
Behavioral proxies: Trust = funds staying on protocol. Excitement = early feature adoption. Fear = withdrawals during volatility.
Temporal analysis: How emotions shift over time. Trust built slowly. Excitement spikes then fades. Fear can appear suddenly.
Competitor comparison: Do users describe your brand in emotionally different terms than competitors? That’s positioning.
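The language-pattern step above can be sketched as a simple emotion-word counter. The lexicon here is illustrative, not a standard; a real one needs your community’s actual slang:

```python
from collections import Counter

# Illustrative emotion lexicon -- swap in the phrases your community actually uses.
EMOTION_WORDS = {
    "trust": ("reliable", "solid", "never failed"),
    "excitement": ("hyped", "can't wait", "bullish"),
    "fear": ("worried", "scared", "exploit", "hack"),
    "anger": ("ridiculous", "frustrated", "fees"),
}

def dominant_emotions(mentions: list[str]) -> Counter:
    """Count how many mentions express each emotion (one hit per mention per emotion)."""
    counts: Counter = Counter()
    for text in mentions:
        t = text.lower()
        for emotion, phrases in EMOTION_WORDS.items():
            if any(p in t for p in phrases):
                counts[emotion] += 1
    return counts

mentions = [
    "This protocol has never failed me",
    "Worried about another exploit tbh",
    "These fees are ridiculous",
    "So hyped for the new feature",
]
print(dominant_emotions(mentions).most_common())
```

Frequency over time is the signal: a community drifting from trust-words to fear-words matters more than any single week’s counts.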
Real patterns:
Uniswap: Trust + boredom. Reliable but not exciting. Measurement: Consistent volume, low social buzz, high retention.
SushiSwap: Anger + fatigue. Drama exhaustion. Measurement: Negative mentions spiking, engagement declining, team turnover discussions.
Base: Excitement + belonging. New and growing. Measurement: High voluntary sharing, community pride, rapid adoption.
Solana: Excitement + frustration. Fast but breaks. Measurement: Speed praise + downtime complaints in same conversations.
Each protocol has emotional signature. Changes over time. Need to track actively.
Community vibe measurement
Vibe isn’t sentiment. It’s energy, engagement level, tone of conversation. Vibe can be positive but low energy (comfortable but stagnating) or negative but high energy (in crisis, but salvageable).
The vibe matrix:
Positive + High Energy = Growth mode. Community excited. Active. Building. Best state.
Positive + Low Energy = Comfortable plateau. Community satisfied but not growing. Stable but not exciting.
Negative + High Energy = Crisis or controversy. Community active but angry/scared. Dangerous but fixable.
Negative + Low Energy = Death spiral. Community checked out. Not even angry anymore. Just gone.
Metrics that capture vibe:
Message velocity: How many messages per hour in Discord/Telegram? Rising = high energy. Declining = low energy.
Response time: How quickly do community members respond to each other? Fast = high engagement. Slow = low engagement.
Thread depth: How long do conversations continue? Deep threads = high interest. Shallow = low interest.
Off-topic ratio: How much non-protocol discussion? Some is healthy (community bonding). Too much = disengagement from protocol.
Emoji usage: High emoji = high energy, expressive community. Low emoji = low energy or serious tone.
Meme generation: Community making new memes about protocol? High creativity = high energy. Reusing old memes = stagnation.
New member integration: New people joining conversations? Welcomed? High integration = healthy community. Ignored = insular/dying.
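The metrics above feed the vibe matrix. A sketch mapping sentiment plus message velocity onto the four quadrants (the energy threshold is an assumption; calibrate it against your community’s own baseline):

```python
def vibe_quadrant(sentiment_score: float, msgs_per_hour: float,
                  baseline_msgs_per_hour: float) -> str:
    """sentiment_score in [-1, 1]; energy is relative to this community's baseline."""
    positive = sentiment_score >= 0
    # Illustrative threshold: below half of baseline velocity counts as low energy.
    high_energy = msgs_per_hour >= 0.5 * baseline_msgs_per_hour
    if positive and high_energy:
        return "growth mode"
    if positive:
        return "comfortable plateau"
    if high_energy:
        return "crisis or controversy"
    return "death spiral"

print(vibe_quadrant(0.4, 500, 400))   # growth mode
print(vibe_quadrant(-0.6, 50, 400))   # death spiral
```

Energy has to be relative, not absolute: 50 messages/hour is high energy for a small protocol and a death spiral for one that used to do 500.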
Example tracking:
Friend.tech peak (Aug 2023):
Message velocity: 500+ messages/hour in Discord
Response time: Under 2 minutes average
Thread depth: 20+ messages per discussion
Meme generation: Daily new variations
Vibe: Positive + high energy = growth mode
Friend.tech decline (Dec 2023):
Message velocity: 50 messages/hour
Response time: 30+ minutes average
Thread depth: 3-5 messages per discussion
Meme generation: None, reusing old memes
Vibe: Negative + low energy = death spiral
Same protocol. 4 months. Complete vibe shift. Measurable through engagement metrics, not just sentiment.
Crisis brand impact tracking
Exploits, hacks, controversies. How does brand weather crisis? This reveals true brand strength.
Before/during/after framework:
Pre-crisis baseline:
Sentiment distribution (what’s normal?)
Message volume (usual activity level)
Dominant emotions (baseline emotional state)
Community cohesion (how united?)
During crisis:
Sentiment shift speed (how fast did it turn?)
Message spike (10x? 100x normal volume?)
Emotional shift (fear? anger? what dominated?)
Community fracturing (splits forming?)
Post-crisis recovery:
Sentiment recovery rate (back to baseline when?)
Message volume normalization (activity returned?)
Emotional healing (trust rebuilt or permanently damaged?)
Community composition (same people or new?)
Real examples:
Wormhole hack (Feb 2022):
Pre-crisis: 60% positive, 30% neutral, 10% negative
Crisis peak: 15% positive, 20% neutral, 65% negative
30 days post: 45% positive, 35% neutral, 20% negative
90 days post: 55% positive, 30% neutral, 15% negative
Brand recovery: Strong. Almost back to baseline in 90 days. Why? Quick response, made users whole, transparent communication.
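Those figures reduce to a single recovery number: what fraction of the positive-sentiment gap has closed since the trough. A minimal sketch:

```python
def recovery_fraction(baseline_pos: float, crisis_pos: float, now_pos: float) -> float:
    """Fraction of the positive-sentiment gap closed since the crisis trough.
    1.0 = fully back to baseline, 0.0 = still at the trough."""
    gap = baseline_pos - crisis_pos
    if gap <= 0:
        return 1.0  # sentiment never dropped below baseline
    return max(0.0, min(1.0, (now_pos - crisis_pos) / gap))

# Wormhole-style numbers: 60% positive baseline, 15% at trough, 55% at day 90.
print(round(recovery_fraction(0.60, 0.15, 0.55), 2))  # 0.89
```

Run the same function on the Terra numbers below and it stays near zero at 90 days. That single ratio is the difference between a crisis and a collapse.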
Terra/Luna collapse (May 2022):
Pre-crisis: 70% positive, 20% neutral, 10% negative
Crisis peak: 5% positive, 10% neutral, 85% negative
30 days post: 8% positive, 15% neutral, 77% negative
90 days post: 10% positive, 20% neutral, 70% negative
Brand recovery: Failed. Never recovered. Why? Fundamental protocol failure, users lost everything, no path to making whole.
What to track in crisis:
Sentiment velocity: How fast sentiment shifts indicates crisis severity. Slow shift = manageable. Instant collapse = existential.
Emotion dominance: Fear (recoverable) vs anger (harder) vs despair (very hard). Fear can be addressed. Despair means people giving up.
Influencer sentiment: When respected community voices turn negative, crisis deepens. When they stay supportive, containable.
Competitor mentions: Users comparing to competitors during crisis = considering leaving. Track share of voice shift.
Support request nature: Technical questions (solvable) vs emotional venting (trust damage) vs exit planning (leaving).
Media amplification: Crisis contained to crypto Twitter (manageable) vs mainstream media (brand damage spreads).
Recovery trajectory: Linear recovery (steady improvement) vs stalled (stuck at negative) vs deteriorating (getting worse).
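Recovery trajectory can be classified mechanically from a series of post-crisis readings using a least-squares slope. A sketch (the slope thresholds are illustrative, not calibrated):

```python
def recovery_trajectory(weekly_scores: list[float]) -> str:
    """Classify post-crisis sentiment readings (oldest first) by least-squares slope."""
    n = len(weekly_scores)
    if n < 2:
        return "stalled"  # not enough data to detect a trend
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(weekly_scores) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, weekly_scores)) \
            / sum((x - mean_x) ** 2 for x in xs)
    if slope > 0.01:       # illustrative thresholds, in sentiment points per week
        return "linear recovery"
    if slope < -0.01:
        return "deteriorating"
    return "stalled"

print(recovery_trajectory([0.15, 0.25, 0.35, 0.45]))  # linear recovery
print(recovery_trajectory([0.15, 0.16, 0.15, 0.16]))  # stalled
```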
Mobile sentiment data collection
Most crypto happens on mobile. Twitter, Discord, Telegram all primarily mobile. Desktop sentiment analysis misses majority of conversations.
Mobile-specific considerations:
Short-form dominates: Mobile users type less. Tweets shorter. Messages more emoji-heavy. Less nuance to analyze.
Voice and video: Mobile users send voice messages (Telegram) and videos (TikTok). Text analysis misses this entirely.
Ephemeral content: Instagram Stories, Snapchat, disappearing messages. Gone before you can analyze.
Context collapse: Mobile users jump between apps. Single conversation fragments across platforms. Hard to track.
Real-time nature: Mobile sentiment shifts faster. Desktop analysis runs batch jobs. Misses rapid mobile-driven shifts.
Collection methods:
Twitter API: Capture tweets mentioning brand. Filter by verified users, engagement level, follower count. Mobile-originated tweets have different characteristics (shorter, more emoji, more images).
Discord monitoring: Track message volume, emoji reactions, thread engagement. Discord mobile app used by 70%+ of users. Mobile patterns different from desktop.
Telegram scraping: Public channels and groups. High mobile usage. Voice messages require transcription. Stickers and emoji carry meaning.
Reddit analysis: Crypto subreddits. Mobile vs desktop posting has different tone. Mobile comments shorter, more reactive.
TikTok/Instagram: Visual sentiment. Can’t just analyze text. Need image/video analysis. Tone of voice in videos. Facial expressions.
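Separating mobile from desktop tweets usually relies on the tweet’s source label (“Twitter for iPhone”, “Twitter Web App”); whether that field is exposed depends on the API version you’re on, so treat both the field and the labels below as assumptions. A sketch comparing emoji density by origin:

```python
import unicodedata

MOBILE_SOURCES = {"Twitter for iPhone", "Twitter for Android"}  # assumed labels

def is_emoji(ch: str) -> bool:
    # Rough heuristic: pictographs fall in the "So" (Symbol, other) Unicode category.
    return unicodedata.category(ch) == "So"

def emoji_density(text: str) -> float:
    return sum(is_emoji(ch) for ch in text) / max(len(text), 1)

def split_by_origin(tweets: list[dict]) -> dict[str, list[float]]:
    """Group emoji densities by mobile vs desktop, using the tweet's source label."""
    groups: dict[str, list[float]] = {"mobile": [], "desktop": []}
    for t in tweets:
        origin = "mobile" if t.get("source") in MOBILE_SOURCES else "desktop"
        groups[origin].append(emoji_density(t["text"]))
    return groups

tweets = [
    {"text": "🚀🚀🚀", "source": "Twitter for iPhone"},
    {"text": "Very bullish on this protocol long term.", "source": "Twitter Web App"},
]
groups = split_by_origin(tweets)
print(groups["mobile"][0] > groups["desktop"][0])  # True
```

The point of splitting: the same sentiment model should be scored separately per origin, because an all-emoji mobile tweet and a full-sentence desktop tweet express the same conviction in completely different surface forms.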
Mobile signal differences:
Emoji frequency: Mobile users emoji-heavy. Desktop users text-heavy. “🚀🚀🚀” from a mobile user carries the same signal as “very bullish” typed out on desktop.
Voice message tone: Telegram voice messages reveal emotion text doesn’t. Excitement, fear, anger obvious in voice.
Video sentiment: TikTok reactions to protocol. Facial expressions and tone more honest than text. Sarcasm detectable.
Image memes: Sharing memes about protocol. Need image recognition + cultural context. Standard sentiment can’t parse this.
Typing speed patterns: Mobile typos vs desktop typos. Fast mobile typing indicates urgency/emotion. Slow careful typing indicates importance.
Tools that work for mobile:
Custom Discord bots: Track real-time engagement, emoji usage, message patterns. Captures mobile activity naturally.
Twitter streaming API: Real-time tweet capture. Mobile tweets identifiable. Can separate mobile vs desktop sentiment.
Telegram monitoring: Bot-based monitoring of public groups. Voice transcription services. Sticker sentiment mapping.
Video analysis tools: For TikTok/Instagram. Facial emotion recognition, tone analysis, visual sentiment.
Multi-platform aggregation: Single dashboard showing sentiment across all platforms. Mobile and desktop combined.
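The aggregation step is simple once per-platform scores exist. A sketch of a volume-weighted roll-up (the weighting scheme is a choice, not a standard; you may want to weight Discord regulars above drive-by tweets):

```python
from dataclasses import dataclass

@dataclass
class PlatformReading:
    platform: str
    sentiment: float   # [-1, 1]
    mentions: int      # volume, used here as the weight

def aggregate_sentiment(readings: list[PlatformReading]) -> float:
    """Volume-weighted average sentiment across platforms."""
    total = sum(r.mentions for r in readings)
    if total == 0:
        return 0.0
    return sum(r.sentiment * r.mentions for r in readings) / total

readings = [
    PlatformReading("twitter", 0.2, 800),
    PlatformReading("discord", -0.4, 150),
    PlatformReading("telegram", 0.1, 50),
]
print(round(aggregate_sentiment(readings), 3))  # 0.105
```

Note what the weighted number hides: Twitter volume drowns out a sharply negative Discord. Keep the per-platform breakdown next to the aggregate.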
What actually matters
After analyzing sentiment for dozens of protocols, patterns emerge:
Good indicators:
Voluntary sharing: Users sharing protocol without prompting. Measured by quote tweets, unpaid mentions, organic discussion.
Recommendation rate: Users recommending to friends. Measured through surveys, referral tracking, “shill me” responses.
Crisis resilience: Sentiment recovery after negative events. Strong brands bounce back. Weak brands don’t.
Emotional diversity: Healthy brands evoke mix of emotions (trust + excitement). Unhealthy brands = single emotion (fear or anger).
Community defense: When outsiders criticize, community defends brand? Strong brand loyalty signal.
Bad indicators:
Paid mention volume: High volume from paid influencers. Looks good superficially. Actually masks weak organic interest.
Positive sentiment from bots: Bot armies praising protocol. Skews sentiment positive but meaningless.
Quiet criticism: No visible negative sentiment because critics already left. Silence isn’t health.
Forced enthusiasm: Community managers pumping positivity. Real community quiet. Manufactured sentiment.
Meme stagnation: Reusing same memes for months. Community creativity dead. Sign of declining energy.
The practical playbook
If you’re tracking brand sentiment properly:
Weekly monitoring:
Platform sentiment: Twitter, Discord, Telegram, Reddit. Aggregate scores. Track trends.
Emotion distribution: What emotions appearing most? Shifting toward fear/anger (bad) or trust/excitement (good)?
Vibe check: Message velocity, response time, thread depth. Community energy level?
Crisis watch: Any emerging issues? Small complaints becoming bigger?
Competitor comparison: Your sentiment vs competitors. Relative brand health.
Monthly deep dive:
Sentiment drivers: What’s causing positive sentiment? What’s causing negative? Specific features, events, decisions?
Emotional evolution: How emotions shifting over time? Building trust? Losing excitement? Track trajectory.
Community cohesion: Community more united or fracturing? Subgroups forming? Healthy debate or toxic splits?
Influencer alignment: Top voices supportive, neutral, or critical? Shifting which direction?
Recovery analysis: Any recent crises? How well did sentiment recover? Faster or slower than expected?
Quarterly strategy:
Brand health scorecard: Overall sentiment, emotion distribution, community vibe, crisis resilience. Grade A-F.
Competitive positioning: How brand sentiment compares to competitors. Improving or declining relative position?
Investment priorities: Where to invest based on sentiment data? Community building? Crisis prevention? Trust building?
Narrative adjustment: Current brand narrative working? Sentiment suggests different positioning?
Platform strategy: Which platforms showing best/worst sentiment? Invest more or differently?
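The scorecard grade can be mechanical once the subscores exist. A sketch assuming four 0–1 subscores with equal weights (both the weights and the cutoffs are illustrative):

```python
def brand_health_grade(sentiment: float, emotion_diversity: float,
                       vibe_energy: float, crisis_resilience: float) -> str:
    """Average four 0-1 subscores into an A-F grade. Cutoffs are illustrative."""
    score = (sentiment + emotion_diversity + vibe_energy + crisis_resilience) / 4
    for cutoff, grade in [(0.9, "A"), (0.75, "B"), (0.6, "C"), (0.4, "D")]:
        if score >= cutoff:
            return grade
    return "F"

print(brand_health_grade(0.8, 0.7, 0.9, 0.8))  # B
```

The value isn’t the letter. It’s forcing the same four subscores to be measured the same way every quarter, so the trend is comparable.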
The honest take
Most crypto projects either:
Don’t track sentiment (flying blind). Or track it wrong (basic positive/negative that misses everything important).
The right approach:
Track emotions not just sentiment. Trust, excitement, fear, anger matter more than positive/negative scores.
Measure vibe not just volume. High-energy engaged community beats large quiet community.
Monitor crisis resilience. Brand strength shows in recovery speed from negative events.
Collect from mobile. That’s where conversations actually happen. Desktop-only analysis misses majority.
Go beyond text. Voice, video, images, memes carry sentiment text analysis can’t capture.
Real-time tracking. Mobile sentiment shifts fast. Batch analysis misses the shift.
The tools exist, but most are built for corporate PR, not crypto culture. Need to adapt methods for sarcasm, memes, irony, and slang.
Or build custom. Discord bots, Telegram monitoring, Twitter streaming, custom dashboards. More work but actually accurate.
Sentiment analysis done right predicts problems before they explode, identifies brand strengths to amplify, shows crisis recovery in real-time.
Done wrong, gives false confidence while brand deteriorates underneath.
Most protocols choose wrong. Few do it right. Competitive advantage for those who actually measure what matters.
Your brand health isn’t what you think it is. It’s what your community feels. Measure those feelings accurately or get surprised when sentiment you thought was good turns out to be sarcastic mockery.
Thank you :)
If your project needs design, brand, product, strategy, and leadership,
let’s talk, hi@dragoon [dot] xyz | Follow: 0xDragoon



