> Math enhanced web traffic
Sendaz
post Jul 20 2014, 02:57 PM
Post #1


http://www.scientificcomputing.com/news/20...10-times-faster


Rad
post Jul 21 2014, 01:52 AM
Post #2


Interesting. Anybody got a link to the actual study?
Shortstraw
post Jul 21 2014, 02:09 AM
Post #3


Linky

Would you like to know more?
Rad
post Jul 21 2014, 04:33 AM
Post #4


Sweet, thanks for the info-dump, Shortstraw.
nezumi
post Jul 21 2014, 02:04 PM
Post #5


Just glancing through, it seems like the idea is to cope with dropped packets by including enough redundant data to rebuild the missing ones at the receiver. The cost, I imagine, is that the overall transmission size increases. So it goes faster at the cost of being bigger.
Draco18s
post Jul 21 2014, 02:20 PM
Post #6


It's actually not "bigger" though. You have to include the resending of dropped packets as part of the size you're comparing against (that is: if a packet is sent twice, you have to count it twice).
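A quick back-of-envelope in Python (my own illustration, not from the paper) shows why you have to count the resends: assuming independent packet loss with probability p, plain resend-until-received costs a geometric-series average of 1/(1-p) sends per delivered packet, while a coded stream with fixed redundancy pays a flat overhead regardless of which packets get dropped.

```python
# Back-of-envelope: expected transmissions per delivered packet,
# assuming independent packet loss with probability p (illustrative only).

def arq_cost(p):
    """Plain retransmission (ARQ): resend until the packet gets through.
    Expected sends per packet form a geometric series: 1 / (1 - p)."""
    return 1.0 / (1.0 - p)

def coded_cost(overhead=0.1):
    """Coded stream with fixed redundancy: send (1 + overhead) packets
    per source packet up front; no per-packet resends are needed as
    long as the overhead covers the expected loss."""
    return 1.0 + overhead

for p in (0.01, 0.05, 0.10):
    print(f"loss={p:.0%}  ARQ={arq_cost(p):.3f}  coded={coded_cost():.3f}")
```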

That said, I don't really understand the math of what it's doing. It uses Markov Chains, which could allow for packet reconstruction, but that's about all I gleaned from the white paper.
pragma
post Jul 23 2014, 03:24 AM
Post #7


I can shed a little light: I have some training in this.

Markov chains are a tool for modeling sequences of random steps. Consider a goofy example: a drunk guy at a bar walks randomly one step to the left or to the right each second. Five steps to the left is his home, and five steps to the right is a three-step cul-de-sac. If you want to figure out how long it will take him to stumble home, given the kinds of weird positions he can get into (stuck looping around the cul-de-sac, for instance), you use a Markov chain. This shows up a lot in communication theory (which is what the research paper is about at heart), because each transmitted packet goes through a few states -- initial transmission, receipt of an acknowledgement or error signal, possible retransmission, etc. -- which are randomly interrupted in ways that look a lot like a drunk guy wandering around a cul-de-sac.
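The bar example can be turned into a tiny Monte Carlo simulation. This is a sketch under my own simplifications: I flatten the cul-de-sac into a dead end five steps to the right that forces a step back toward the bar, so it's an absorbing state (home) on one side and a reflecting wall on the other.

```python
import random

def steps_home(home=-5, wall=5):
    """One drunkard's walk: each second he steps -1 or +1 with equal
    probability, starting at the bar (position 0). Home at `home` is
    absorbing; the dead end at `wall` forces a step back. Returns the
    number of steps taken before he reaches home."""
    pos, steps = 0, 0
    while pos != home:
        if pos == wall:                  # end of the cul-de-sac: forced back
            pos -= 1
        else:
            pos += random.choice((-1, 1))
        steps += 1
    return steps

random.seed(42)
trials = [steps_home() for _ in range(10_000)]
# Standard hitting-time result for this chain: starting 5 states from an
# absorbing end of an 11-state strip with one reflecting end, the
# long-run average is k * (2N - k) = 5 * 15 = 75 steps.
print("mean steps to stumble home:", sum(trials) / len(trials))
```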

The authors are planning to change how packets are encoded, using something called (I think) 2nd order Galois field encoding, so that it is easier to catch and correct errors at the receiver. This is interesting because it apparently translates into higher throughput in the demonstration shown to the reporter in the initial link. The tradeoff is that it pushes more computational effort onto the receiver, which, if it's your cellphone, might not be able to handle the increased power budget. So, nezumi, the right way to think about it is to ask where the computational effort needs to be: the packet size will probably stay the same or shrink, but you'll need to do more math to decode it.
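To make the coding idea concrete, here's a minimal sketch over GF(2) -- plain XOR -- which is a toy stand-in for the larger Galois-field arithmetic the paper apparently uses (the packet contents and helper name are made up). The sender ships two data packets plus one coded packet; if either data packet is dropped, the receiver rebuilds it locally with a bit of math instead of asking for a resend.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings: addition in GF(2)."""
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"hello world....."
p2 = b"goodbye world..."
coded = xor_bytes(p1, p2)          # third packet on the wire

# Suppose p2 is dropped in transit; the receiver still holds p1 and coded,
# and XORing them cancels p1 out, leaving p2:
recovered = xor_bytes(p1, coded)
print(recovered == p2)             # True: rebuilt without a resend
```

The decode step is where the receiver's extra computational effort comes from; with a real Galois field the sender mixes many packets per coded packet and the receiver has to solve a small linear system, which is why a phone's power budget matters.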

I'm not sure this is really a big story. It's clear the authors charmed the journalist they were talking to, but arXiv doesn't have peer review, so the paper doesn't have the authority of a journal behind it; there could be obvious errors that an internet reporter (or a glance from Shadowrun forum members) won't catch. On top of that, the technology is being popularized by a startup, which doesn't necessarily have any traction. I'm in wait-and-see mode on this one.
Sendaz
post Jul 23 2014, 07:40 AM
Post #8


Indeed.

I had heard about something along these lines many years ago, but it never got far back then, so I was interested to see whether they've made any progress since.
Shemhazai
post Jul 23 2014, 05:48 PM
Post #9


I recommend this short lecture by Frank Fitzek, one of the study authors, before digging into the study itself.

Frank Fitzek, Aalborg University: Network Coding for Future Communication and Storage Systems

For me, it's fun to speculate that something like this partly underlies a revolutionary change from Internet to Matrix.