NVIDIA Acquires AGEIA Technologies

by Rainier on Feb. 4, 2008 @ 2:08 p.m. PST

NVIDIA announced that it has signed a definitive agreement to acquire AGEIA Technologies. More details will be revealed during NVIDIA's quarterly results announcement next Wednesday.

"The AGEIA team is world class, and is passionate about the same thing we are—creating the most amazing and captivating game experiences," stated Jen-Hsun Huang, president and CEO of NVIDIA. "By combining the teams that created the world's most pervasive GPU and physics engine brands, we can now bring GeForce®-accelerated PhysX to hundreds of millions of gamers around the world."

"NVIDIA is the perfect fit for us. They have the world's best parallel computing technology and are the thought leaders in GPUs and gaming. We are united by a common culture based on a passion for innovating and driving the consumer experience," said Manju Hegde, co-founder and CEO of AGEIA.

Like graphics, physics processing is made up of millions of parallel computations. The NVIDIA® GeForce® 8800 GT GPU, with its 128 processors, can process parallel applications up to two orders of magnitude faster than a dual- or quad-core CPU.
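To see why physics maps so well onto a many-processor GPU, consider that each object's update in a simulation step is independent of every other's. The sketch below (hypothetical, using NumPy; not AGEIA or NVIDIA code) shows the data-parallel pattern: one vectorized operation advances a million particles at once, the same shape of work a GPU would spread across its processors.

```python
import numpy as np

def step(positions, velocities, dt=0.016,
         gravity=np.array([0.0, -9.8, 0.0])):
    """Advance every particle one frame with simple Euler integration.

    All particles are updated in a single vectorized operation; on a
    GPU the same math would run as thousands of concurrent threads.
    """
    velocities = velocities + gravity * dt   # apply gravity to all particles
    positions = positions + velocities * dt  # move all particles
    return positions, velocities

# One million particles, updated together in one step.
n = 1_000_000
pos = np.zeros((n, 3))
vel = np.zeros((n, 3))
pos, vel = step(pos, vel)
```

Because no particle's result depends on another's, the workload scales almost linearly with the number of processors, which is the property the quote above is pointing at.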

"The computer industry is moving towards a heterogeneous computing model, combining a flexible CPU and a massively parallel processor like the GPU to perform computationally intensive applications like real-time computer graphics," continued Mr. Huang. "NVIDIA's CUDA™ technology, which is rapidly becoming the most pervasive parallel programming environment in history, broadens the parallel processing world to hundreds of applications desperate for a giant step in computational performance. Applications such as physics, computer vision, and video/image processing are enabled through CUDA and heterogeneous computing."
