
Nvidia and Epic show the UE4-powered "Rivalry" demo running on Tegra K1

Cool, as long as I can run this for 10 minutes before the battery runs out...
Personally I think this whole Tegra thing was a big gamble for Nvidia; they're not ready for it. I hope they make it, though, because if it fails, that's bound to impact the desktop side. All the R&D spent on this must have been huge.
 
I believe the new Nvidia Shield 2 will be using the chip.

Yeah, but basically no one buys the Shield, not the first one and probably not the second. Any console-like hardware needs exclusives; otherwise, once it shows there's a market, you get a couple dozen competitors in the following months, driving your margins and market share down. Especially if some of the competition already has highly efficient hardware, that will push you to spend more on R&D while losing margins and market share...
I don't know, this whole Shield thing, I just don't see it.
But what do I know, you ask me :D. They know better; it's just that to me, Tegra doesn't seem to be evolving fast enough to stay relevant against what's already out there.
 
K1 products:

- MiPad (first one)
- Nexus 9 / Volantis (almost certain, lots of leaked info out there)
- Google Project Tango
- Google Android TV devkit
- Shield 2
- Shield tablet (also almost certain, with leaked info out there)
- All the car contracts with Audi etc.
 
I just came here to post this:

http://semiaccurate.com/2014/01/20/well-nvidias-tegra-k1-really-perform/

The heatsink on the K1s that Nvidia was showing in private was about 2 x 2 x .4 inches, a bit large for a 5W part, don't you think? In fact it is a bit large for a 28″ AIO device, at least as far as Z-height goes; Lenovo can't cool the 2.3GHz variant in the depth allowed. That should give you a very good idea of how much power a 2.3GHz K1 takes under load.

For the same reasons that the 'overwhelming performance' of the Tegra 4 ended up with one phone on the market a year after release, the Tegra 5 will have a similar success story. It can do big numbers at unspecified wattages, but so can the Apple A7 and the Qualcomm Snapdragon 800. The difference is that neither company needs to show those numbers; their performance in power-constrained form factors is more than adequate, not to mention both are on the market today. Nvidia, on the other hand, doesn't give out power figures, but may someday when the device is released.

Has this issue been sorted out? Do we know anything more? Fitting a huge heatsink into a tablet isn't viable.
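
To put the quoted heatsink size in perspective, here is a rough back-of-envelope sketch in Python. The unit conversion is exact, but the natural-convection rule of thumb and the fin-area multiplier are assumptions for illustration, not measurements of the demo unit:

```python
# Rough back-of-envelope on the quoted 2 x 2 x 0.4 inch heatsink.
# The rule of thumb and fin multiplier below are assumptions for
# illustration, not measured data from the demo hardware.

INCH_TO_CM = 2.54

length_cm = 2.0 * INCH_TO_CM   # 5.08 cm
width_cm = 2.0 * INCH_TO_CM    # 5.08 cm

# Flat-plate surface area, counting both faces:
base_area_cm2 = 2 * length_cm * width_cm  # ~51.6 cm^2

# Assumed rule of thumb: passive natural convection sheds roughly
# 0.05-0.1 W per cm^2 at a tolerable temperature rise.
w_per_cm2 = 0.075

# Assumed: fins on a ~1 cm tall sink multiply effective area ~3-5x.
fin_multiplier = 4

dissipation_w = base_area_cm2 * w_per_cm2 * fin_multiplier
print(f"Effective area: ~{base_area_cm2 * fin_multiplier:.0f} cm^2")
print(f"Rough passive budget: ~{dissipation_w:.0f} W")  # ~15 W, well above 5 W
```

Under those loose assumptions, a sink this size is plausibly specced for somewhere in the 10-20 W range, which is why the article treats it as evidence against a 5 W rating.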
 
I'll never get another device with Nvidia inside. My old phone, the HTC One X, had a Tegra 3 chip. Nvidia stopped supporting it within the second year. Even for rooting it to upgrade, or going S-OFF, Nvidia still would not release the source code.

So after that, I won't touch another phone or tablet with Nvidia, because of the bad experience with my old phone.
http://www.phonearena.com/news/HTC-...on-One-X-exploring-update-options-now_id52667
 
Is this running on a real K1, or one of the cheating 10-watt heatsink versions like what they did with Tegra 4? Either way, I'm not that impressed with the K1 specs. It looks to be another hyped, underpowered chip.
 
I just came here to post this:

http://semiaccurate.com/2014/01/20/well-nvidias-tegra-k1-really-perform/

Has this issue been sorted out? Do we know anything more? Fitting a huge heatsink into a tablet isn't viable.

It was never an issue in the first place. Charlie just wrote his usual stuff about Nvidia without any care for the facts.

What you saw in those demos is an automotive dev kit for the K1; that's the sort of heatsink it's supposed to run with in the car world.

For better numbers, check the Jetson TK1 whitepaper as well as the performance numbers from the Xiaomi MiPad.

It's completely feasible and the best mobile SoC for GPU performance at the moment. It delivers ~50% more GPU performance than the Apple A7 at the same power draw.
 
For better numbers, check the Jetson TK1 whitepaper as well as the performance numbers from the Xiaomi MiPad.

It's completely feasible and the best mobile SoC for GPU performance at the moment. It delivers ~50% more GPU performance than the Apple A7 at the same power draw.
That is not true at all. It's not even out yet, and it's slower than what we have in products today. Its specs in some areas are low.

There is no evidence it produces 50% more GPU performance than the Apple A7 at the same power level. At least none that I have seen. The only time I have seen it beat the A7 was a rigged benchmark where it had 5x more power, which wouldn't even fit in a phone.

That, and the A7 just got 10x faster due to Metal, and by the time the TK1 hits the market it won't even be up against the A7 but the next generation, which is far higher spec and faster.

The TK1 is shaping up to be the slowest of the next-gen GPUs.
 
That is not true at all. It's not even out yet, and it's slower than what we have in products today.

The TK1 is shaping up to be the slowest of the next-gen GPUs.

I would like to see your figures for this. The TK1 should be a lot better (GPU-wise) compared to the Qualcomm S800/S801 (what all current Android flagships run), and should even beat the new Qualcomm chips around the corner.
 
I would like to see your figures for this. The TK1 should be a lot better (GPU-wise) compared to the Qualcomm S800/S801 (what all current Android flagships run), and should even beat the new Qualcomm chips around the corner.
The G6230 Rogue GPU is around a year old and a match for the Tegra K1.
http://www.ubergizmo.com/2014/06/onda-v989-tablet-smashes-48000-antutu-mark/

The G6230 is a single-core GPU. By the time the Tegra K1 is out, the dual-core Rogue chips will be around. Some of the year-old G6230's specs are 3x higher than the Tegra K1's. Surely that means the dual-core chip will have some specs 6x higher.

That, and Tegra chips have a long history of cheating. The first benchmark runs are fast, but after the 4th or 5th run it has lost around 30% performance. It cannot maintain the speed past 5 minutes. This has been confirmed time and time again. Meanwhile, the G6230 Rogue stays at full speed after 5 minutes.
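
For what it's worth, that throttling claim is easy to test yourself: run the same benchmark back-to-back and watch whether the score decays. A minimal sketch of that methodology, where run_benchmark() is a hypothetical hook you would wire up to whatever GPU test you actually use:

```python
import time

def run_benchmark():
    """Hypothetical hook: launch a real GPU benchmark run
    (e.g. an offscreen graphics test) and return its score."""
    raise NotImplementedError

def sustained_performance_test(runs=10, cooldown_s=0):
    """Run the benchmark back-to-back and report score decay.

    A chip that throttles under sustained load shows scores
    dropping run over run; one that holds its clocks stays flat.
    """
    scores = []
    for i in range(runs):
        scores.append(run_benchmark())
        delta = (scores[-1] / scores[0] - 1) * 100
        print(f"run {i + 1}: {scores[-1]:.0f} ({delta:+.1f}% vs run 1)")
        time.sleep(cooldown_s)  # 0 seconds = no cooldown, worst case
    return scores
```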


http://wccftech.com/nvidia-tegra-k1...-mipad-ship-32bit-64bit-denver-powered-chips/
The Tegra K1 peaked at 11.06 W to get those benchmark scores. The only reason it scored so high was because it cheated. An 11 W drain! Just how fast do you think a G6230 could be if it used 11 watts?
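
Whatever the real numbers are, the fairer comparison is points per watt rather than peak score. Here is a quick sketch of that normalization; the 11.06 W figure is from the linked article and the 48,000 score is from the Onda V989 link above, but the TK1 score and the G6230 wattage are placeholders I've assumed for illustration, not published figures:

```python
# Normalize benchmark scores by power draw instead of comparing peaks.
# tk1_watts is the peak quoted in the linked wccftech article; the
# 48,000 G6230 score is from the Onda V989 link above. tk1_score and
# g6230_watts are PLACEHOLDERS, not published figures.

tk1_score = 45_000     # placeholder AnTuTu-style score for the TK1 run
tk1_watts = 11.06      # peak draw quoted in the linked article
g6230_score = 48_000   # Onda V989 AnTuTu figure from the link above
g6230_watts = 4.0      # placeholder; substitute a measured value

for name, score, watts in [("Tegra K1", tk1_score, tk1_watts),
                           ("G6230", g6230_score, g6230_watts)]:
    print(f"{name}: {score / watts:,.0f} points/W")

# "How fast at 11 W?" -- naive linear scaling, which is optimistic
# because perf/W falls as clocks and voltage rise:
print(f"G6230 scaled to 11.06 W: ~{g6230_score * 11.06 / g6230_watts:,.0f}")
```

The linear scaling in the last line overstates the real gain, but it makes the point concrete: peak scores taken at very different wattages aren't directly comparable.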
 