Wednesday, November 21, 2012

The Browser

WebGL.

Is.

The.

Immediate.

Future.


Google Earth in your browser.


Quake 3 in your browser.


The possibilities are endless.

I'm drooling.



19 comments:

Barbarosa said...

Very impressive.

I think it shows that Google has made a smart bet with its Chromebook and Chromebox. The future is in the cloud. The present already is, partly.

Also, it further underlines how scared the manufacturers of consumer electronics must be. As processing migrates to server farms and what not, your personal hardware becomes less and less relevant.

Dementor said...

WebGL makes extensive use of the client's GPU.
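
A minimal sketch of what that means in practice (my own illustration, not from the post; the shaders and variable names are made up for the example): the snippet below creates a WebGL context, compiles a shader pair, and draws a triangle, and all of that work is done by the visitor's own browser and graphics card.

    // Minimal illustrative sketch: everything here executes in the visitor's
    // browser, and the shaders are compiled for and run on the client's GPU.
    const canvas = document.createElement("canvas");
    document.body.appendChild(canvas);
    const gl = canvas.getContext("webgl");
    if (!gl) throw new Error("WebGL not supported by this browser/GPU");

    // Trivial GLSL shaders, compiled by the client's own GPU driver.
    const vsSource = "attribute vec2 pos; void main() { gl_Position = vec4(pos, 0.0, 1.0); }";
    const fsSource = "precision mediump float; void main() { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }";

    function compile(ctx: WebGLRenderingContext, type: number, src: string): WebGLShader {
      const shader = ctx.createShader(type)!;
      ctx.shaderSource(shader, src);
      ctx.compileShader(shader);
      return shader;
    }

    const program = gl.createProgram()!;
    gl.attachShader(program, compile(gl, gl.VERTEX_SHADER, vsSource));
    gl.attachShader(program, compile(gl, gl.FRAGMENT_SHADER, fsSource));
    gl.linkProgram(program);
    gl.useProgram(program);

    // Upload one triangle and draw it; the rasterization happens on the client GPU.
    const buf = gl.createBuffer()!;
    gl.bindBuffer(gl.ARRAY_BUFFER, buf);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([0, 1, -1, -1, 1, -1]), gl.STATIC_DRAW);
    const loc = gl.getAttribLocation(program, "pos");
    gl.enableVertexAttribArray(loc);
    gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
    gl.drawArrays(gl.TRIANGLES, 0, 3);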

Barbarosa said...

What about the other instances? Also, isn't it possible that graphical processing could be moved away from the user end?

Dementor said...

I'm not sure what you mean by other instances. Even if all graphics were processed on the servers (which would put an unreal amount of stress on the network), all the machines in the farms would still have to be equipped with the same GPUs. The cloud doesn't make hardware disappear, it just displaces it.
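
To put a rough number on that network stress (my own back-of-envelope, not from the thread): shipping raw, uncompressed 1080p frames at 60 fps works out to roughly 3 gigabits per second per player, which is why any real remote-rendering service has to lean on heavy video compression.

    // Back-of-envelope sketch (illustrative numbers only): bandwidth needed to
    // stream uncompressed 1080p frames at 60 fps from a render farm to one client.
    const width = 1920, height = 1080;
    const bytesPerPixel = 3;          // 24-bit RGB
    const framesPerSecond = 60;
    const bytesPerSecond = width * height * bytesPerPixel * framesPerSecond;
    console.log(`${(bytesPerSecond * 8 / 1e9).toFixed(1)} Gbit/s per client, before compression`);
    // Prints "3.0 Gbit/s per client, before compression".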

Dementor said...

The other instances of browser-embedded 3D rendering? Like Flash or something? All I know is that WebGL is well on its way to becoming the standard for 3D graphics on the web.

Barbarosa said...

By instances, I meant other shit that needs to be calculated beyond just the graphics. I do not know enough about computers to give you an authoritative example, but what about the AI in a game? Is that processed by the GPU? I would think it's taken care of elsewhere.

As for hardware, I agree that the cloud does not make it disappear. However, in my original point, I did specifically mention consumer electronics, meaning that the hardware core is displaced (to use your verb) from the consumer end to the content producer end.

Dementor said...

I understood your point. In this case the consumers would become the content producer end, and my point is I don't see why the demand for GPUs should diminish. Maybe the price would go down with the disappearance of the gaming enthusiasts' compulsive-buying factor, but the game servers would still have to spit out those frames.

I think, and CrawMaster might correct me on this, there's already a shift in modern distributed game software where the servers take care of most of the logical crunching, leaving the client PCs to deal mostly with graphics handling. For instance, the biggest online shooter right now, PlanetSide2, uses both my CPU and GPU to render graphics. So yeah, you would be right about that, I guess.
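
Roughly, the split being described looks like this (a hypothetical sketch of my own, not any real game's code; names like serverTick and clientRender are made up): the server runs the authoritative simulation and sends compact state snapshots over the network, and the client spends its own CPU/GPU budget turning those snapshots into frames.

    // Hypothetical sketch of the logic/graphics split, not real game code.
    interface PlayerState { id: number; x: number; y: number; }

    // Server side: authoritative simulation (physics, AI, hit detection) runs on
    // the server's CPUs; only small state snapshots go out over the network.
    function serverTick(players: PlayerState[], dt: number): PlayerState[] {
      return players.map(p => ({ ...p, x: p.x + dt * 1.0 })); // stand-in for the real simulation
    }

    // Client side: receives a snapshot and renders it locally, which is where
    // the player's own CPU and GPU get used.
    function clientRender(snapshot: PlayerState[]): void {
      for (const p of snapshot) {
        console.log(`draw player ${p.id} at (${p.x}, ${p.y})`); // stand-in for the GPU draw calls
      }
    }

    // One frame of the loop: the server simulates, the client renders.
    const next = serverTick([{ id: 1, x: 0, y: 0 }], 1 / 60);
    clientRender(next);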

Dementor said...

CPUs are already way too powerful for normal users, even extreme gamers. I bought mine 5 years ago and it's still kicking ass. They're trying to sell those multi-core chips to people who really don't need them (normal people on the client side).
On the other hand, I upgraded my graphics card only a year ago, and I'm already thinking about getting a new one.

Dementor said...

... there's a reason why the Chromebook comes with a GPU.

Dementor said...

And that won't change, because your idea of frames being transferred over the network is physically impossible, I think, in my humble opinion.

Dementor said...

(physically impossible for highly interactive graphical software, of course)

Dementor said...

Then again, there's this shit... http://www.nvidia.ca/object/vgx-boards.html

Dementor said...

I know it's not impossible for a local network. I'm talking about the Internet.

Barbarosa said...

Wow, the Nvidia VGX looks pretty sweet. Although we'll see what the reality is like. Marketing claims aren't worth that much in the real world, as we all know.

Thanks for all the information you shared.

Napoleon Bonerpants said...

Tits

Napoleon Bonerpants said...

Hey,

Speaking of wasted processing: whatever happened to distributed computing? Wasn't that supposed to be a thing? Ideally, consumer electronics would be used by virtual servers and contribute to the cloud instead of just leeching off of it. Kind of like the PlayStation Network, I think. And the economic model could be the same one that applies to energy-producing homes that get compensated for feeding the grid. Haven't we reached that point? And have we reached real cloud services with real packet-level redundancy and no single point of failure? I'm testing a lot of newfangled solutions and I'm still not sure. At least, it doesn't seem to be very widespread.

Dementor said...

I think you can lend your processing time to some geeks somewhere so they can listen to aliens.

Barbarosa said...

http://setiathome.ssl.berkeley.edu/

Barbarosa said...

BOINC is a program that lets you donate your idle computer time to science projects like SETI@home, Climateprediction.net, Rosetta@home, World Community Grid, and many others.