The Financial Results Thread

The shine is already coming off GPT4 judging by this thread: https://news.ycombinator.com/item?id=36134249

Looks like there's broad consensus that it's getting less impressive rather than more as OpenAI try to scale it out. Might signal the hype cycle is already starting to die down a bit.

I paid for the Plus version and have found it overall the same as the free version tbh. Plus/GPT-4 is a bit more accurate in its answers, but as always with this, to get the best from it you have to ask the questions in the right way, provide some context about what you are aiming to achieve, and probe it a bit if you need more info/clarity. At least this is the case with the software development orientated questions I ask it. Based on my own usage, the best thing by far is using it to assist on problems (literally talking about saving hours' worth of reading to troubleshoot issues) and to give me a base on where to start. The main weakness of this tech for now is that it can't apply best practices; if you ask it to use best practices it can probably come up with something, but since the data it uses is only up to September 2021 it will be a bit behind. Then again, there are still a lot of companies stuck in the 2000s with their tech...

Ultimately ChatGPT is really nothing more than a fun tool and personal assistant. Most companies aren't wanting to use it directly; rather, they want their own versions for internal use because of data privacy and cyber attack concerns, i.e. we can't use it with any sensitive data because of our company policy where nothing can leave our network, so we are currently working with Microsoft on our own internal version.
 
Yeah. My experience with it is that it's more of a UX improvement over googling for programming queries. Less diving in and out of different links to find good examples.

At the end of the day I need to read code more than I have to write it and GPT is far better at providing solutions that exist in its training data than novel solutions (which are typically confidently wrong). This actually severely limits its usefulness for working on an established closed source code base with years of novel domain knowledge and legacy built into it.

So it's like, it definitely has value. But it's more incremental than people are imagining, I think.
 

Gatekeeping for plebs: if anyone can create art, write a book or write complex code, then no one can.

When they took away the ability of the ordinary man to earn a living from selling his labour, they told him "learn to code, pleb". Now the pleb can do precisely that and they don't like it.
 
Not to get too political in a GPU forum, but labour is priced as a product of supply and demand like anything else. Probably the focus there should be less on GPT and more on making the world a nicer place to live in as a "pleb".

Otherwise if we can automate a bunch of labour, produce more stuff and somehow make the world worse for a lot of people then we've sort of ****ed up somewhere I think.
 

It's quite obviously being nerfed from having its own opinions, because that leads to it being politically incorrect.

But that's not really the point. Read some of the comments in the link.

Anyway, on the politics: farm out what labour you can to China, import en masse what you can't, to keep wages down. Globalism, in a nutshell.
 

shhh, labour theory of value - Capital, volume 1 :P
 
On the politically incorrect side, probably, yeah. This whole thing hinges on optics, so it's not surprising that they would want to mitigate the risk of it appearing in any way incompatible with corporate money, especially the Silicon Valley liberal types (strongly "new left" social views but economically right and loaded).

But yeah I have read a few of the comments. I'm interested in the people who seem to think that it's giving them worse answers in general even where it doesn't relate to politics or opinions. I'm inclined to believe the people who seem to be speculating that this is possibly down to trying to scale the thing to the demand they're seeing.

I know some of their earlier optimisation attempts were thwarted when a cache was found to be leaking info between different users' sessions, for instance, which is a huge security flaw. So maybe dumbing down the inference is another attempt, but it eats directly into GPT-4's value proposition.
 
It used to be that a PE ratio of 70 was considered high risk; Nvidia was hovering around 200+ very recently. Absolutely insane valuation, given the volatility of the markets they trade in and the reliance on a single manufacturer.
 
A 200x PE is just crazy, pure speculation then!

Either the market would have to grow like crazy, Nvidia would have to milk even more than they currently do, or share traders are just speculating like crazy. If the market were to grow like crazy, then as AI has few patent walls (currently), other players will enter. And while GPUs and Nvidia's tools may be good for training, for running existing models far more fixed-function custom ASICs will dominate.

There is no way all those other monster billion-dollar companies will just sit back and pay Nvidia's rates if, with a bit of planning, they can have their own teams doing their own custom ASICs.
 
Seems like they're losing quite a bit of their momentum this week. Down 5% so far today. Think people are losing their appetite to "buy at the peak" since this is about as high as market caps tend to go.
 
Kind of get the feeling there's too much money in stocks like Nvidia to allow it to fail; the ratios just make no sense.

For 2023 Nvidia turned over $27b and made a net profit of $4.2b. Their share price as of right now is $390.96 with 2.47b outstanding shares, giving earnings per share of about $1.70. Divide the share price by the earnings per share and the PE ratio is roughly 230 (for scale, a company like Tesco has a PE ratio of 27, HSBC is 7, Meta/Facebook 27, all of whom make similar or bigger profits than Nvidia).

For Nvidia's current valuation to make sense they would need to grow their revenue base from $27b to $121b within 5 years (by 2028) and maintain a 15.5% net profit margin (and that assumes the share price remains static, which realistically it won't), which would generate net profits of around $18.7b per year.

I don't know how long investors think it will take Nvidia to achieve those goals, but do they really price shares that far in advance? Given a lot of hyperscalers are building out their own custom solutions it seems kind of risky, but then this goes back to my first point about there being too much money in it to allow it to fail.
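
For anyone who wants to sanity-check that maths, here's a rough Python sketch using the round numbers quoted above (the post's figures, not official filings), just to show where the ~230 PE and the ~$18.7b profit projection come from:

[CODE]
# Quick sanity check of the valuation arithmetic quoted above.
# All inputs are the round numbers from the post, not official filings.

revenue_2023 = 27e9          # quoted annual revenue, USD
net_profit_2023 = 4.2e9      # quoted net profit, USD
share_price = 390.96         # quoted share price, USD
shares_outstanding = 2.47e9  # quoted shares outstanding

eps = net_profit_2023 / shares_outstanding   # ~$1.70 per share
pe_ratio = share_price / eps                 # ~230

# Forward scenario from the post: revenue reaches $121b by 2028
# at a 15.5% net margin, with the share price assumed static.
target_revenue_2028 = 121e9
net_margin = 0.155
projected_profit_2028 = target_revenue_2028 * net_margin             # ~$18.7-18.8b
required_cagr = (target_revenue_2028 / revenue_2023) ** (1 / 5) - 1  # ~35% per year

print(f"EPS ${eps:.2f}, PE ratio {pe_ratio:.0f}")
print(f"2028 net profit ${projected_profit_2028 / 1e9:.1f}b, "
      f"required revenue CAGR {required_cagr:.0%}")
[/CODE]

That works out to a revenue CAGR of roughly 35% a year for five years, which gives an idea of the growth the current price seems to be assuming.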
 