piggymouse: (umlactor)
[personal profile] piggymouse

Funnily enough, just last week I was chatting with a friend about possible non-mainstream applications of search engines. Here's another one, and it is completely breathtaking. Here's the blog entry and here's the Slashdot post.

…The underlying principle is easy – so easy, in fact, that the researchers working on this enabled the system to translate from Chinese to English without any of them being able to speak Chinese. The translation system treats every language the same, and there is no manually created rule set of grammar, metaphors and such. Instead, the system learns from existing human translations. Google relies on a large corpus of texts which are available in multiple languages.

<…>

All it needs is someone to feed the system two books and to tell it that the two are translations from language A to language B, and the translator can create what Franz Och called a "language model." I suspect it's crucial that the body of text be immensely large, or else the system would stumble upon too many unlearned phrases while translating. Google used United Nations documents to train their machine, feeding it 200 billion words in all. This is brute-force AI, if you like – it works on statistical learning alone and has no real "understanding" of anything but patterns.
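
I don't know which model Google actually uses, but a classic way to pull word translations out of aligned text with no grammar rules at all is expectation-maximisation over co-occurrence counts (IBM Model 1). A minimal Python sketch on a made-up three-sentence corpus – everything here (the word pairs, the iteration count) is illustrative, not Google's setup:

    from collections import defaultdict

    # Made-up toy parallel corpus; the real system was fed ~200 billion words.
    corpus = [
        ("das haus", "the house"),
        ("das buch", "the book"),
        ("ein buch", "a book"),
    ]

    src_vocab = {f for src, _ in corpus for f in src.split()}
    tgt_vocab = {e for _, tgt in corpus for e in tgt.split()}

    # Start from uniform translation probabilities t(e | f).
    t = {(e, f): 1.0 / len(tgt_vocab) for f in src_vocab for e in tgt_vocab}

    for _ in range(20):                     # a few EM rounds
        count = defaultdict(float)          # expected pair counts c(e, f)
        total = defaultdict(float)          # expected context counts c(f)
        for src, tgt in corpus:             # E-step: fractional alignments
            fs, es = src.split(), tgt.split()
            for e in es:
                norm = sum(t[(e, f)] for f in fs)
                for f in fs:
                    frac = t[(e, f)] / norm
                    count[(e, f)] += frac
                    total[f] += frac
        for (e, f), c in count.items():     # M-step: renormalise the counts
            t[(e, f)] = c / total[f]

    # "haus" co-occurs only with "house", and EM pins the pair down
    # without anyone telling the program a word of German or English.
    best = max(tgt_vocab, key=lambda e: t[(e, "haus")])
    print(best, round(t[(best, "haus")], 3))    # -> house, close to 1.0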
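
As for the "language model" part and why corpus size is crucial: a toy maximum-likelihood bigram model makes the point concrete, since any phrase absent from the training text gets probability zero. Again just a sketch, with a made-up sentence standing in for one of the two books:

    from collections import Counter

    # Made-up stand-in for one of the "two books".
    text = "the house is big . the book is small . a book is on the table ."
    tokens = text.split()

    bigrams = Counter(zip(tokens, tokens[1:]))  # adjacent word pairs
    contexts = Counter(tokens[:-1])             # left-context counts

    def prob(prev, word):
        """Maximum-likelihood estimate of P(word | prev)."""
        return bigrams[(prev, word)] / contexts[prev] if contexts[prev] else 0.0

    print(prob("the", "book"))  # seen in training: 1/3
    print(prob("the", "moon"))  # never seen: 0.0 -- an "unlearned phrase"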

A question to [livejournal.com profile] dimkaguarani and other experts: is it possible that our brains employ similar structures, i.e. large pattern maps built from experience?

Date: 2005-06-01 08:24 am (UTC)
From: [identity profile] cjelli.livejournal.com
Yes, our brains do, and computers don't.

The key is the massively parallel processing and I/O bandwidth of our brains, which isn't yet on computers' horizon. Note that there are words and translations which are ambiguous even to humans.

Date: 2005-06-01 09:48 am (UTC)
nine_k: A stream of colors expanding from brain (Default)
From: [personal profile] nine_k
OTOH, if anyone has a really massively parallel computer installation, it is Google.

I wonder if and when custom ASICs that efficiently implement neural nets will be mass-produced. We already have dedicated 3D-graphics silicon, and physics engines are emerging.

Date: 2005-06-01 02:50 pm (UTC)
From: [identity profile] dimkaguarani.livejournal.com
AFAIK, they do, though the actual degree of reliance on such structures is disputable.

Date: 2005-06-01 04:17 pm (UTC)
From: [identity profile] ivan-ghandhi.livejournal.com
This reminds me of a little vignette from "Физики шутят" ("Physicists Are Joking"):

I am thankful to Dr.X for translating the article.
I am thankful to Dr.X for translating the note above.
I am thankful to Dr.X for translating the note above.

Here the chain breaks off, since the author proved able to reproduce the text word for word.
