
Search the Community

Showing results for tags 'meena'.




Found 1 result

  1. Google has released a neural-network-powered chatbot called Meena that it claims is better than any other chatbot out there.

     Data slurp: Meena was trained on a whopping 341 gigabytes of public social-media chatter—8.5 times as much data as OpenAI’s GPT-2. Google says Meena can talk about pretty much anything, and can even make up (bad) jokes.

     Why it matters: Open-ended conversation that covers a wide range of topics is hard, and most chatbots can’t keep up. At some point most say things that make no sense or reveal a lack of basic knowledge about the world. A chatbot that avoids such mistakes will go a long way toward making AIs feel more human, and make characters in video games more lifelike.

     Sense and specificity: To put Meena to the test, Google has developed a new metric it calls the Sensibleness and Specificity Average (SSA), which captures important attributes for natural conversations, such as whether each utterance makes sense in context—which many chatbots can do—and is specific to what has just been said, which is harder.

     What do you mean? For example, if you say “I like tennis” and a chatbot replies “That’s nice,” the response makes sense but is not specific. Many chatbots rely on tricks like this to hide the fact that they don’t know what you’re talking about. On the other hand, a response such as “Me too—I can’t get enough of Roger Federer” is specific. Google used crowdworkers to generate sample conversations and to score utterances in around 100 conversations. Meena got an SSA score of 79%, compared with 56% for Mitsuku, a state-of-the-art chatbot that has won the Loebner Prize for the last four years. Even human conversation partners only scored 86% in this new test.

     Can I talk to Meena? Not yet. Google says it won’t be releasing a public demo until it has vetted the model for safety and bias, which is probably a good thing. When Microsoft released its chatbot Tay on Twitter in 2016, it started spewing racist, misogynistic invective within hours.
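The SSA itself is straightforward arithmetic once crowdworkers have labelled each chatbot utterance for sensibleness and specificity. The Python snippet below is a minimal sketch of that calculation, assuming SSA is the simple average of the fraction of utterances rated sensible and the fraction rated specific (as the metric's name suggests); the labels and numbers in it are made-up examples, not Google's data or code.

    # Minimal sketch of a Sensibleness and Specificity Average (SSA) calculation.
    # Assumes each chatbot utterance carries two crowdworker labels:
    # "does it make sense in context?" and "is it specific to what was just said?".

    def ssa(labels):
        """labels: list of (sensible, specific) boolean pairs, one per utterance."""
        if not labels:
            raise ValueError("need at least one labelled utterance")
        sensibleness = sum(sensible for sensible, _ in labels) / len(labels)
        specificity = sum(specific for _, specific in labels) / len(labels)
        return (sensibleness + specificity) / 2

    # Illustrative labels: "That's nice" would be sensible but not specific,
    # "Me too, I can't get enough of Roger Federer" would be both.
    example = [(True, False), (True, True), (False, False)]
    print(f"SSA = {ssa(example):.0%}")  # sensibleness 67%, specificity 33% -> SSA = 50%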
