→ innominate: I honestly can't tell what you're trying to say; I'm just 09/25 19:21
→ innominate: giving a basic primer on Shannon's theory 09/25 19:22
→ innominate: The amount of information and whether the information is 09/25 19:22
→ innominate: correct are two entirely different things 09/25 19:22
→ innominate: Take the example you gave: "plamc is a dinosaur" carries a 09/25 19:25
→ innominate: lot of information, but it isn't necessarily true. Here's an 09/25 19:26
→ innominate: even simpler example: say I set out to deceive someone and I 09/25 19:26
→ innominate: produce a lie. That lie still carries information, and my 09/25 19:26
→ innominate: goal is to transmit that information to the receiver 09/25 19:26
→ innominate: That's why Shannon's formula has a leading negative sign, to 09/25 19:28
→ innominate: guarantee the information content is never negative. Honestly, 09/25 19:28
→ innominate: arguing this with me is pointless; if you want to go overturn 09/25 19:28
→ innominate: the concept of Shannon entropy, I'll cheer you on 09/25 19:28
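For reference, the formulas under discussion, in standard notation (a minimal sketch of the textbook definitions, not a quote from anyone in the thread): the self-information of an outcome x and the Shannon entropy of a source X. The leading minus sign is what keeps both quantities non-negative, since log p(x) <= 0 whenever 0 < p(x) <= 1.

    I(x) = -\log_2 p(x)
    H(X) = -\sum_{x} p(x)\,\log_2 p(x)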
→ innominate: on_theory) 09/25 19:43
→ innominate: For instance, the knowledge that some particular number will 09/25 19:53
→ innominate: not be the winning number of a lottery provides very little 09/25 19:53
→ innominate: information, because any particular chosen number will almost 09/25 19:53
→ innominate: certainly not win. However, knowledge that a particular number 09/25 19:53
→ innominate: will win a lottery has high informational value because it 09/25 19:54
→ innominate: communicates the outcome of a very low probability event. 09/25 19:54
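A small numeric sketch of the quoted lottery example (the 1-in-10,000,000 odds below are an assumed figure for illustration only, not taken from the thread or from Wikipedia):

    import math

    p_win = 1 / 10_000_000        # assumed probability that one specific ticket wins
    p_lose = 1 - p_win            # probability that the same ticket does not win

    # self-information in shannons (bits): I(x) = -log2(p(x))
    info_lose = -math.log2(p_lose)   # ~0.00000014 Sh: "this ticket lost" carries almost no information
    info_win = -math.log2(p_win)     # ~23.25 Sh: "this ticket won" is highly informative

    print(f"I(lose) = {info_lose:.8f} Sh")
    print(f"I(win)  = {info_win:.2f} Sh")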
推 joh: Well said, and right on point; the commenter above should take a careful look 09/25 21:38
→ innominate: I don't know what exactly I'm supposed to look at. I was 09/25 23:49
→ innominate: just giving a primer on Shannon's information entropy. If 09/25 23:49
→ innominate: you want to go overturn information theory, that's your 09/25 23:49
→ innominate: own business 09/25 23:49
→ innominate: I wonder whether the poster above has even read the English 09/25 23:50
→ innominate: Wikipedia passage I quoted 09/25 23:50
推 tomer: He's walking you through where information content and entropy 09/26 00:30
→ tomer: come from. Honestly, right here you can already tell who has had 09/26 00:32
→ tomer: formal mathematical training and who is just making up a story 09/26 00:32
→ tomer: from the pictures, heh 09/26 00:32
→ innominate: What could someone who hasn't even grasped the definition of 09/26 03:38
→ innominate: information entropy possibly teach me? Did you read the 09/26 03:38
→ innominate: English I quoted above? Do I have to translate the English 09/26 03:38
→ innominate: for you as well? 09/26 03:38
→ innominate: ers/shannon/entropy/entropy.pdf 09/26 04:05
→ innominate: Go read Shannon's original paper for yourselves 09/26 04:05
→ innominate: Now I pretty much understand why ZM sometimes just doesn't 09/26 04:09
→ innominate: bother explaining; damn it, I give you a primer and still 09/26 04:09
→ innominate: get picked apart 09/26 04:09
→ innominate: You talk about information theory as if I invented it; if 09/26 04:10
→ innominate: you want to quibble, go quibble with Shannon 09/26 04:10
→ innominate: It's all coming back to me: years ago on the history board 09/26 04:12
→ innominate: I also gave a primer on the economics of hyperinflation, and 09/26 04:12
→ innominate: a crowd of contrarians showed up to nitpick that too 09/26 04:12
→ innominate: Let me quote the English description from Wikipedia again: 09/26 04:22
→ innominate: The core idea of information theory is that the 09/26 04:23
→ innominate: "informational value" of a communicated message depends on the 09/26 04:23
→ innominate: degree to which the content of the message is surprising. If 09/26 04:23
→ innominate: a highly likely event occurs, the message carries very little 09/26 04:23
→ innominate: information. On the other hand, if a highly unlikely event 09/26 04:23
→ innominate: occurs, the message is much more informative. 09/26 04:24
→ innominate: You won't read the wiki, the paper, or a book; all you do is nitpick 09/26 04:25
→ innominate: Also, this post wrongly splits the "unit of measure" apart 09/26 04:42
→ innominate: from "entropy"; that's not how it works, OK? Per the Wikipedia 09/26 04:43
→ innominate: passage he himself cited: The "shannon" also serves as a unit 09/26 04:43
→ innominate: of the information entropy of an event, which is defined as 09/26 04:43
→ innominate: the expected value of the information content of the event. 09/26 04:43
→ innominate: That is, entropy can be measured in Sh 09/26 04:43
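In standard notation, the relation the quoted sentence describes (again a sketch of the textbook definition, not a new claim): entropy is the expected value of the information content, and when the logarithm is taken base 2 the resulting unit is the shannon (Sh), i.e. the bit.

    H(X) = \mathbb{E}[I(X)] = \sum_{x} p(x)\,\log_2 \frac{1}{p(x)} \qquad [\text{Sh}]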
推 tomer: Well, Wikipedia isn't the Bible XD And you have the same problem 09/26 07:16
→ tomer: as ZM: you both keep assuming you're the only ones who can read 09/26 07:17
→ tomer: English, treating a few English sentences on Wikipedia as 09/26 07:17
→ tomer: absolute truth and arrogantly applying them everywhere. Note 09/26 07:17
→ tomer: that I'm not saying Wikipedia is wrong; it's that your limited 09/26 07:17
→ tomer: mathematical training leaves blind spots in your understanding. 09/26 07:17
→ tomer: What f is talking about here requires the basic notion of a 09/26 07:17
→ tomer: random event from probability theory, roughly what a professor 09/26 07:17
→ tomer: covers at the start of a sophomore probability course, ABC-level 09/26 07:17
→ tomer: stuff really. He briefly walked through how to understand 09/26 07:17
→ tomer: Shannon entropy and information content from the probability 09/26 07:18
→ tomer: viewpoint, since the whole theory is derived from probability 09/26 07:18
→ tomer: theory. But you clearly haven't had that training and still 09/26 07:18
→ tomer: want to rebut, so you can only keep repeating "go read those 09/26 07:18
→ tomer: few sentences on English Wikipedia." Honestly, all one can do 09/26 07:18
→ tomer: is smile. 09/26 07:18
推 joh: Treating the wiki as gospel... without realizing the landmines are all in the details 09/26 09:53

→ innominate: So all the contrarians can do is harp on sophomore probability. 09/26 11:56
→ innominate: You can't even follow information theory, and you won't read 09/26 11:56
→ innominate: the Wikipedia you cited yourselves; for the record, I wasn't 09/26 11:56
→ innominate: the one who cited Wikipedia first. When someone's stance suits 09/26 11:56
→ innominate: you, you don't question him. History-board contrarians just 09/26 11:56
→ innominate: use intro college courses to talk down to people 09/26 11:56
→ innominate: Even if the Wikipedia passage I quoted were wrong, you should 09/26 11:59
→ innominate: at least point out which sentence is wrong. I posted the paper 09/26 11:59
→ innominate: too, yet instead of engaging with the text you keep harping on 09/26 11:59
→ innominate: sophomore probability. I have a graduate degree in 09/26 11:59
→ innominate: communications/computer science; you're going to teach me 09/26 12:00
→ innominate: probability? 09/26 12:00
推 joh: Nobody was taking issue with probability theory anyway; the points at issue go beyond that 09/26 14:13
→ ZMittermeyer: Those two aren't contrarians; they just don't understand 09/26 22:27
→ ZMittermeyer: and have a few basic definitions backwards 09/26 22:27
→ ZMittermeyer: A Chinese-language brain is naturally prone to getting 09/26 22:27
→ ZMittermeyer: some basic definitions reversed 09/26 22:27
→ ZMittermeyer: I've found that the dynamic/math brain and the static/ 09/27 00:08
→ ZMittermeyer: verbal brain are two different systems 09/27 00:08
→ ZMittermeyer: Once you get the foundations backwards, deductive 09/27 00:08
→ ZMittermeyer: reasoning builds the mirror-image edifice 09/27 00:08