Nvidia Rides AI Wave to Pass Apple as Largest Company

(bloomberg.com)

119 points | by LopRabbit 241 days ago

12 comments

  • nvidiafanbot 241 days ago
    Apple has an entire diversified product roadmap and ecosystem. Nvidia has a gpu. I don’t see longevity for Nvidia.
    • farseer 241 days ago
      Nvidia also has CUDA, which has surpassed all rival attempts at GPU programming frameworks in the last 2 decades. I remember learning OpenCL and rooting for it to become the standard but couldn't find a single job for that role.
      • cyberax 241 days ago
        CUDA is nice, but it's not a moat. ROCm exists, and even creating a totally new AI compute API is not that far-fetched.
        • _zoltan_ 241 days ago
          CUDA is totally a moat. ROCm exists but AMD is like 100k engineering years behind NVDA just in terms of how much time NVDA had to invest in CUDA and all the ecosystem around it.

          need dataframes and pandas? cuDF. need compression/decompression? nvcomp. need vector search? cuVS. and the list goes on and on and on.
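
          For a concrete taste of that ecosystem, here's a minimal cuDF sketch (assuming the RAPIDS cuDF package and a CUDA GPU are available; the data and column names are made up):

            # cuDF mirrors the pandas API but executes on the GPU.
            import cudf
            df = cudf.DataFrame({"price": [10.0, 20.0, 30.0], "qty": [3, 1, 2]})
            df["total"] = df["price"] * df["qty"]            # element-wise multiply runs as a GPU kernel
            print(df.sort_values("total", ascending=False))  # GPU sort, pandas-style result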

          • bjornsing 241 days ago
            > need dataframes and pandas? cuDF. need compression/decompression? nvcomp. need vector search? cuVS. and the list goes on and on and on.

            Sure, but that doesn’t mean I’m going to pay a billion extra for my next cluster – a cluster that just does matrix multiplication and exponentiation over and over again really fast.

            So I’d say CUDA is clearly a moat for the (relatively tiny) GPGPU space, but not for large scale production AI.

            • erinaceousjones 241 days ago
              > Relatively tiny GPGPU space

              There are a lot of other things which are very GPU-parallelizable which just aren't being talked about because they're not part of the AI language model boom, but to pick a few I've seen in passing just from my (quite removed from AI) job:

              - Ocean weather forecasting / modelling
              - Satellite imagery and remote sensing processing / pre-processing
              - Processing of spatial data from non-optical sensors (Lidar, sonar)
              - Hydrodynamic and aerodynamic turbulent flow simulation
              - Mechanical stress simulation

              Loads of "embarrassingly parallel" stuff in the realms of industrial R&D are benefitting from the slow migration from traditional CPU-heavy compute clusters to ones with GPUs available, because even before the recent push to "decarbonise" HPC, people were seeing the increase in "work done per watt" type cost efficiency is beneficial.

              Probably "relatively tiny" right now compared to the AI boom, but that stuff has been there for years and will continue to grow at a slow and steady pace, imo. Adoption of GPGPU for lots of things is probably being bolstered by the LLM bros now, to be honest.

              CUDA benefits from being early to market in those areas. Mature tools, mature docs, lots of extra bolt-ons, organizational inertia "we already started this using CUDA", etc.

              • bjornsing 241 days ago
                Sure, all that’s interesting and highly realistic. But does it make Nvidia the most valuable company in the world?
          • cyberax 241 days ago
            It's not really 100k years. A ROCm subset good enough to run an LLM can be done within a year or so. You can _technically_ run it now with tinygrad.

            The silence from AMD has been deafening, though. I can't fathom why they're just ignoring the AI market.

          • HDThoreaun 241 days ago
            There are billions of dollars being funneled into an open source CUDA replacement by all the big tech firms that are pissed about monopoly pricing from Nvidia. I just can't see that never working.
        • malux85 241 days ago
          I’ve been using GPU accelerated computing for more than 10 years now. I write CUDA kernels in my sleep. I have seen 10s of frameworks and 10s of chip companies come and go.

          I have NEVER even HEARD of ROCm, and neither has anyone in the GPU programming Slack group I just asked.

          CUDA is absolutely a moat.

    • aurareturn 241 days ago
      Apple has the iPhone. It’s very concentrated in the iPhone. You can say services but services are powered by the iPhone as well.
      • bjornsing 241 days ago
        But there are people who own those iPhones, and many of them care that it’s an Apple iPhone. When you use ChatGPT, do you care if the matrix multiplications were done on genuine Nvidia hardware, or custom OpenAI ASICs?
        • aurareturn 241 days ago
          In gaming, people do care if it's Nvidia or AMD. They want Nvidia.

          For AI, I assume enterprises do care if it's Nvidia. Right now, Nvidia is in the "no one ever got fired for buying Nvidia" camp. You can buy AMD to save a few dollars, run into issues, and get fired.

          • bjornsing 241 days ago
            I’m not questioning Nvidia’s moat in gaming. But that’s pretty much irrelevant to its valuation.

            And sure, for small scale and research activities Nvidia makes sense, even for the long term. But that’s not where the money is at, either.

            The only way Nvidia can sustain its valuation is if it can win in the large scale production AI market long term, and that market turns out to be huge. I don’t really see how that can happen. As soon as NN architectures are stable over the economic lifetime of a chip, custom ASICs will make so much more sense: 10x more performance and no Nvidia tax. It’s already happening [1].

            1. https://www.etched.com/

            • aurareturn 241 days ago
              Let's say the Transformer market is 10x bigger in 2035 than in 2025. Nvidia maintains 40% of the entire market, down from the 90% today. They'd still be vastly bigger in 2035 than in 2025.

              >10x more performance and no Nvidia tax. It’s already happening

              Nvidia can do the same though.

              The moat isn't in inference. It's in training.

              • bjornsing 241 days ago
                > Let's say the Transformer market is 10x bigger in 2035 than in 2025. Nvidia maintains 40% of the entire market, down from the 90% today. They'd still be vastly bigger in 2035 than in 2025.

                Sure, but they currently have a P/E ratio of 67… So being vastly bigger in 2035 is not necessarily enough. They have to be enormously bigger and still hugely profitable.
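
                To put rough numbers on that (toy arithmetic using the $3.5T market cap and P/E of 67 mentioned in this thread; the "mature" P/E of 25 is just an illustrative assumption):

                  # Toy arithmetic, not exact financials.
                  market_cap = 3.5e12
                  implied_earnings = market_cap / 67   # ~ $52B/year priced in at P/E 67
                  needed_earnings = market_cap / 25    # ~ $140B/year to justify the same price at an assumed P/E of 25
                  print(round(implied_earnings / 1e9), round(needed_earnings / 1e9))  # 52 140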

                > Nvidia can do the same though.

                Yes, but then they don’t have that moat.

                > The moat isn't in inference. It's in training.

                  I’d say it’s even narrower than that: Nvidia's AI moat is in training novel / unforeseen NN architectures. Will that be a meaningful moat in 2035?

                • aurareturn 241 days ago
                  It's really hard to tell. It's all speculative. If you say you don't believe Nvidia is worth $3.5t, and you want to short Nvidia, good luck. But at the same time, I can see why some people don't think Nvidia can sustain itself.

                  But it all comes down to how good the GPT5 class of LLMs are for the next 2 years.

                  • bjornsing 241 days ago
                    > But it all comes down to how good the GPT5 class of LLMs are for the next 2 years.

                    I guess what I’m saying is that it really doesn’t.

                    But in order to make money on shorts you not only have to be right, you also have to get the timing right. That’s a lot more difficult…

    • paxys 241 days ago
      Take away iPhone and Apple is worth at best 1/10 of its current market cap. The company isn't as diversified as you think.
      • JimDabell 241 days ago
        AirPods alone generate five times as much revenue as Spotify, and about two-thirds as much as Netflix. Apple is also the world’s biggest watchmaker.

        When some of Apple’s side-quests are industry giants in their own right, I think it’s fair to say that Apple are diversified.

        • paxys 241 days ago
          The vast, vast majority of Airpods are sold as an accessory to the iPhone. In fact they stopped including earbuds in the box for this exact reason.
        • littlestymaar 241 days ago
          That's true, but at the same time these are mostly iPhone accessories. Should the iPhone fall out of fashion (which I don't expect, but it's a thought experiment) then the sales of these products will decline as well, so is that really diversification?
          • JimDabell 240 days ago
            Should the iPhone fall out of fashion, Apple will make the Apple Watch compatible with whatever displaces it.

            AirPods are already usable as normal Bluetooth wireless earphones.

        • aurareturn 241 days ago
          Airpods are only popular because of the iPhone.
          • HDThoreaun 241 days ago
            I've compared AirPods to competitors and the AirPods were clearly better imo. Sure, they became popular because of the iPhone, but they're also just a competitive product.
        • yread 241 days ago
          Ever tried using an Apple Watch without an iPhone?
          • bilbo0s 241 days ago
            Well that's a knife that cuts 3 ways though.

            What I mean is, think about the addressable market comprised of the people who want to use the watch without iPhone.

            I think sometimes we underestimate the number of people out there who do not have an iPhone.

      • r00fus 241 days ago
        Consumer sales vs. B2B sales are different beasts. Absent some titanic shift (e.g. Apple is banned from sales), consumer sales are more durable.

        If AI doesn't pan out, those H100s are not going to find a lot of sales anywhere and Nvidia could be back to being a gaming GPU company.

      • lm28469 241 days ago
        It's "only" about 50% of their revenues, which isn't too bad.
        • xethos 241 days ago
          For the iPhone itself, sure.

          Now account for the drop in companion devices (AirPods and the Apple Watch work best with an iPhone, and it's pretty apparent when you aren't using one)

          Then take a gander at cloud services that no longer push themselves via the red badge on every handset, for the photos people are taking on competing platforms (Google Photos is the obvious danger, but everything from Dropbox through OneDrive poses risks to total revenue)

          Should probably account for App Store revenue as well, considering why most Apple customers buy apps, and then how many buy Macintosh just to develop for the massive iPhone market

          None of these are exclusively tied to the iPhone, but none look nearly as nice without the iPhone being a (and possibly even the) market leader

    • ikrenji 241 days ago
      a gpu is all you need if you’re one of two companies worldwide that makes them
      • ClassyJacket 241 days ago
        Especially if you're the only one that makes GPUs for AI use
    • ksec 241 days ago
      With the profits they are getting now, I would have loved Nvidia to diversify into other markets with their GPUs. Server CPUs would be a great opportunity.
    • Dr_Birdbrain 241 days ago
      (deleted because I was wrong)
      • andsoitis 241 days ago
        > Do they even do any R&D anymore?

        Yes.

        “Apple annual research and development expenses for 2024 were $31.37B, a 4.86% increase from 2023. Apple annual research and development expenses for 2023 were $29.915B, a 13.96% increase from 2022. Apple annual research and development expenses for 2022 were $26.251B, a 19.79% increase from 2021”

        https://www.macrotrends.net/stocks/charts/AAPL/apple/researc...

        • monocasa 241 days ago
          R&D spending is basically a lie at most tech companies because of how the tax grants for R&D spending work.
      • mgh2 241 days ago
        Regarding R&D: they have published quite a few reports on integrating their products with health.

        Ex: https://news.ycombinator.com/item?id=41491121

        https://news.ycombinator.com/item?id=41948739

        Tested by community: https://news.ycombinator.com/item?id=41799324

        https://news.ycombinator.com/item?id=42019694

        VR was a roughly 10-year project, but agreed, there's no product-market fit yet.

      • lukan 241 days ago
        Could you please not delete your comment in the future, even if you are wrong, but maybe just add an edit stating that?

        It would make reading the thread easier.

      • godelski 241 days ago

          > They seem to have lost faith in their own ability to innovate.
        
        As they should. I mean they can, but they have to change course. All of Silicon Valley has tried to disenfranchise the power users, with excuses that most people don't want those things or that users are too dumb. But the power users are what drives the innovation. Sure, they're a small percentage, but they are the ones who come into your company and hit the ground running. They are the ones that will get to know the systems in and out. They do these things because they specifically want to accomplish things that the devices/software doesn't already do. In other words: innovation. But everyone (Google and Microsoft included) is building walled gardens. Pushing out access. So what do you do? You get the business team to innovate. So what do they come up with? "idk, make it smaller?" "these people are going wild over that gpt thing, let's integrate that!"

        But here's the truth: there is no average user. Or rather, the average user is not representative of the distribution of users. If you build for average, you build for no one. It is hard to invent things, so use the power of scale. It is literally at your fingertips if you want it. Take advantage of the fact that you have a cash cow. That means you can take risks, that you can slow down and make sure you are doing things right. You're not going to die tomorrow if you don't ship, you can take on hard problems and *really* innovate. But you have to take off the chains. Yes, powerful tools are scary, but that doesn't mean you shouldn't use them.

        • andsoitis 241 days ago
          > the average user is not representative of the distribution of users.

          What does this mean? Just thinking about iPhones: As of September 2024, there are an estimated 1.382 billion active iPhone users worldwide, which is a 3.6% increase from the previous year. In the United States, there are over 150 million active iPhone users.

          • godelski 241 days ago
            Are you math inclined? This is easier to explain with math (words) but I can put it in more English if you want.

            If you're remotely familiar with high dimensional statistics, one of the most well known facts is that the mass of a high-dimensional normal distribution concentrates on a thin shell, while a uniform ball is evenly distributed. Meaning if you average samples from the normal, the result is not representative of the samples. The average sits inside the ball, but remember, all the sampling comes from the shell! It is like drawing a straight line between two points on a basketball: the middle of that line is going to be air, not rubber. But if you do the same for a uniform ball, the average is representative. That's the definition of uniform... Understanding this, we know that users' preferences are not determined by a single thing, and honestly, this fact becomes meaningful when we're talking like 5 dimensions...[0]. This isn't just true for normal distributions, it is true for any distribution that is not uniform.
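
            A quick numpy sketch of that shell effect (the dimension and sample count are arbitrary):

              # In high dimensions, standard normal samples concentrate on a shell of
              # radius ~sqrt(d), while their average sits near the origin -- so the
              # "average point" looks like none of the actual samples.
              import numpy as np

              rng = np.random.default_rng(0)
              d, n = 1000, 10_000
              x = rng.standard_normal((n, d))
              print(np.linalg.norm(x, axis=1).mean())   # ~ sqrt(1000) ~ 31.6
              print(np.linalg.norm(x.mean(axis=0)))     # ~ 0.3, nowhere near the shell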

            To try to put this in more English: there are 1.382 billion active iPhone users worldwide. They come from nearly 200 countries. The average person in Silicon Valley doesn't want the same thing as the average person in Fresno, California. Do you think the average person in Japan wants the same thing as the average Californian? The average American? The average Peruvian? Taste and preference vary dramatically. You aren't going to make a meal that everyone likes, but if you make a meal with no flavor, at least everyone will eat it. What I'm saying is that if you try to make something for everyone, you make something with no flavor, something without any soul. The best things in life are personal. The things you find most enjoyable are not always going to be what your partner, your best friends, your family, or even your neighbor finds most enjoyable. We may have many similarities, but our differences are the spice of life, they are what make us unique. It is what makes us individuals. We all wear different size pants; why would you think we'd all want to put the same magic square in our pockets (if we even have pockets)? We can go deeper with the clothing or food analogy, but I think you get that a chef knows how to make more than one dish and a clothing designer knows you need to make more than one thing in different sizes and colors.

            [0] https://stats.stackexchange.com/a/20084

    • mgoetzke 241 days ago
      Apple once had just an iMac.
      • timc3 241 days ago
        When was that? I remember Apple always having multiple products except at the very start.
        • r00fus 241 days ago
          Back in 1998 I think - but that was a reboot when Jobs took over again.

          At the very start they had the Apple I.

    • andsoitis 241 days ago
      > Nvidia has a gpu.

      In addition to the GPUs (which they invented) that Nvidia designs and manufactures for gaming, cryptocurrency mining, and other professional applications, the company also creates chip systems for use in vehicles, robotics, and other tools.

      • Legend2440 241 days ago
        Okay, but $22.6B of their $26B revenue this quarter was from datacenter GPUs.

        The only reason they are this big right now is because they are selling H100s, mostly to other big tech companies.

      • archerx 241 days ago
        Really? Nvidia’s marketing and PR teams are trying to trick people into thinking that they invented GPUs? Does Nvidia have no shame and/or scruples?

        The term "GPU" was coined by Sony in reference to the 32-bit Sony GPU (designed by Toshiba) in the PlayStation video game console, released in 1994.

        https://en.wikipedia.org/wiki/Graphics_processing_unit

        • andsoitis 241 days ago
          Before the first major commercial GPU (Nvidia’s GeForce 256 in 1999), graphics processing was handled by graphics accelerators or video cards, which had limited functionality and primarily assisted with 2D rendering tasks. Early computers often used framebuffers and simple video adapters to display basic graphics. Complex tasks were often handled by the CPU.

          It is true that everything exists on a gradient, but for practical purposes we have to draw the line somewhere and it seems reasonable to me to draw it roughly here.

        • pests 241 days ago
          It really depends on what you want to call a "GPU"

          The device in the PS1 has also been referred to as a "Geometry Transfer Engine"

          You can see it's features and specs here: https://en.m.wikipedia.org/wiki/PlayStation_technical_specif...

          Some may say that it is not a "real GPU", or that it is missing certain features (like 3D) that would make it one.

          The Nvidia claim is for the GeForce 256 released in 99.

          This makes me wonder if our grandkids will be debating on what the first "real AI chip" was - would it be what we call a GPU like the H100 or will a TPU get that title?

          • Slyfox33 241 days ago
            The Geometry Transform Engine is a separate chip. It's a CPU co-processor; all it does is some basic linear algebra. The CPU uses it to perform that math faster and then writes the result back to RAM, where it ships the data off to the actual GPU, which is something completely separate. (I've written a PSX emulator.)
          • archerx 241 days ago
            That is just pedantic and a bit disingenuous. 2D/3D accelerators existed before 1999.

            I actually had one of these cards; https://en.wikipedia.org/wiki/S3_ViRGE

            It sucked but it was technically one of the first "GPU"s.

            Also let us not forget 3dfx and the Voodoo series cards.

            Don't let Nvidia rewrite history please.

            • pests 241 days ago
              I agree, but a deeper question in my post was: when did the old technology evolve into what we call a GPU today?

              I don't want to rewrite history either.

              It's partially telling that you write "2D/3D accelerators", which means that was a different class of thing - if they were a full GPU then you would have called them that.

              My point being - what defines what a GPU is? Apparently there were things called GTEs, accelerators, and so on. Some feature or invention crossed the line for us to label them as GPUs.

              Just like over the last ~10 years we have seen GPUs losing graphical features and picking up NN/AI/LLM stuff to the point we now call these TPUs.

              Will the future have confusion over the first ~AI CHIP~? Some conversation like

              "Oh technically that was a GPU but it has also incorporated tensor processing so by todays standards it's an AI CHIP."

              • archerx 241 days ago
                > It's partially telling that you write "2D/3D accelerators" which means that was a different class of thing

                It's because that's what they were called at the time. Just because someone calls a rose a different name doesn't mean it doesn't smell the same.

                What defines a GPU? It's a compute unit that processes graphics; yes, it is that simple. There were many cards that did this before Nvidia.

                An A.I. chip is just a tensor processing unit, a TPU; this is not that hard to grasp, in my opinion.

                • pests 241 days ago
                  > What defines a GPU? It's a compute unit that processes graphics; yes, it is that simple

                  But you originally declared the Sony chip as the first GPU. There were many things that processed graphics before that, as you have declared. Apparently going back to the Amiga in the 70s.

                  It is this muddiness with retroactively declaring tech a certain kind that I'm questioning here.

                  • vidarh 241 days ago
                    > Apparently going back to the Amiga in the 70s.

                    Development of the Amiga started in 1982, and it was launched in 1985.

                    • pests 241 days ago
                      Oops, meant Atari not Amiga.

                      The Tele-Games Video Arcade was released in 77 and renamed to the Atari 2600 in 82.

        • aithrowawaycomm 241 days ago
          It is technically correct if you take "GPU" to mean "capable of doing 2D and 3D graphics on a single chip." But I think that's hair-splitting to the point of advertisement: the older 3dfx Voodoo was a GPU that handled the 3D math while the video card itself handled the 2D math. And of course that 2D video card itself had a unit which processed graphics! "Both in one chip" is an important milestone but it's pretty arbitrary as a definition of GPU.
          • archerx 241 days ago
            The S3 ViRGE could do 3D and 2D on the same card and is older than the GeForce.

            https://en.wikipedia.org/wiki/S3_ViRGE

            • aithrowawaycomm 241 days ago
              Single chip, not single card. The 1997 RIVA 128 really was an integrated chip. But again this is splitting hairs, making a notable technical advancement seem more profound than it really is.
  • zamalek 241 days ago
    The trough of disillusionment is going to be frightening, and I believe we're starting to see the first signs of it (with Apple's paper, ironically enough).
    • lm28469 241 days ago
      ~2 years into the LLM revolution, not a single productivity metric has significantly gone up in any country, while we collectively spent hundreds of billions on it.

      What really happened: LLMs are fuelling the phishing and scam industry like never before, search engine results are shittier than ever, every other website is an automated LLM-generated SEO placeholder, every other image is AI-generated, students are heavily using these techs for homework with unknown long term effects, scientists are using them to write their papers (quality--, quantity++), bots on social media increased dramatically and are probably playing a huge role in social tensions. You can probably add a few dozen things to the list.

      • RohMin 241 days ago
        It's unfair to dismiss and simplify this technology when it only just started reaching critical mass ~2 years ago. Looking at history, transformative technologies rarely show immediate productivity gains. The internet took nearly a decade from the early 1990s before we saw clear economic impact. Electricity needed 30 years (1890s-1920s) to significantly transform industrial productivity, as factories had to completely redesign their operations. Personal computers had been around since the late 1970s, but their productivity impact wasn't measurable until the late 1980s/early 1990s. Major technological shifts require time for society to develop optimal use cases, build supporting infrastructure, and adapt workflows.
        • mgh2 241 days ago
          Agreed, but is this technology's promise too much to bet on a single company? Are we being sold R&D dreams or tangible products?

          Compare this with firms that had an actual working product (ex: Lightbulb, Internet, Windows, iPhone) and not the mimics of one (ChatGPT, Bitcoin, Metaverse, Nanotechnology, Quantum).

          PS: There is just too much money in the economy now (infused from covid), so chasing speculative investments is expected.

          • RohMin 241 days ago
            Those "actual" working products had little use until infrastructure was built around it to make it useful. The same could definitely be happening to LLMs, we'll just have to wait and see. It is just way too early to claim that they're a mimic of a product.
          • det2x 241 days ago
            That people expect this technology to have such an impact so fast says a lot about the technology.
            • mgh2 241 days ago
              Or about how good its marketing is and how gullible and greedy people are. Remember, big tech controls the fundraising narrative through social media.
            • theappsecguy 240 days ago
              Not really. A lot of people are happy to take in a metric ton of cash from what is very questionable hype. Nvidia certainly doesn’t mind everyone overselling AI, and neither do CEOs drooling at the thought of AI replacing workers. That’s the reason behind all this insanity.
      • aubanel 241 days ago
        So your point is:
        1. The LLM revolution has had no significant positive impact so far.
        2. A dozen areas (phishing, web search, web content, images, education, science, social media) were completely changed by LLMs, for the worse.

        Your point 2 seems to indicate that the potential for impact is huge. People also think of positive impacts, not only negative ones, that's why they invest.

        • lm28469 237 days ago
          > People also think of positive impacts, not only negative ones, that's why they invest.

          They invest because greedy mfers don't want to miss the next Apple or Google. They'll throw millions at the wall and see what sticks; remember Juicero and Theranos?

      • openrisk 241 days ago
        > we collectively spent hundred of billions on it

        Remember the Metaverse? We are talking about entities that can burn billions on speculative projects and it is just a rounding error on quarterly profits. Let alone that frequently the outlay is in part recovered via the various "sell shovels in a gold rush" strategies.

        This is what a total stranglehold on a critical sector helps with. Inefficiency of capital allocation at interplanetary scale - but that's another story.

        LLMs might be new but the ML/AI type of efficiency/productivity argument is decades old. The domains where it leads to "gains" are generally ones where the org deploying the algos manages not to be held responsible for the externalities unleashed.

        • rangestransform 241 days ago
          The short-termism of traditional shareholder capitalism is hardly a preferable outcome.

          Companies having more money than god and spending it speculatively at least got us self-driving cars.

          • openrisk 241 days ago
            Oligopolistic / Competitive vs Short-termist / Long-term are rather orthogonal dimensions and derive from different types of regulations and incentives.
    • ninjin 241 days ago
      Not sure which paper you are referring to. Would you be happy to share it for those of us that seemingly have missed it?
    • kahon65 241 days ago
      Not ready
  • gnabgib 241 days ago
    Discussion (79 points, 12 days ago, 48 comments) https://news.ycombinator.com/item?id=41952389
  • unsnap_biceps 241 days ago
    I really wonder how the future tariffs are going to shape the AI industry. Are we going to see huge AI clusters being hosted overseas? Will it eat into Nvidia's bottom line or will consumers just eat the price increase?
    • doctorpangloss 241 days ago
      People are paying for the software, which is made here in the US, and also not subject to tariffs per se. Same could be said about Apple too - you are buying the iPhone for its software, that is what you are paying for, and the software is made here in the US. The customers, the company and the country enjoy the best surpluses in these scenarios. The tariff is like a glorified tax, it's not going to change much.
      • favflam 241 days ago
        I am guessing countries subject to tariffs will retaliate by making FANG's life hard.
    • seydor 241 days ago
      AI pirates are probably going to be a big thing, but it will not be because of tariffs; they will be running illegal models.
      • patates 241 days ago
        Illegal models! I'm sure there will be more regulation to come but the law has already struggled a lot with the internet and the transactions through it. I wonder how much trial and error this wave will cause.
    • sekai 241 days ago
      The tariff hike plan will end up the same as Trump's health plan: coming in two weeks, for 4 years.
      • AuryGlenz 241 days ago
        It’s not like he shied away from them before. I think he’s starting with a high number to force concessions from China, but we’ll still have some amount of broader tariffs in the end.

        Our health system is so screwed up it’s no wonder they can’t make anything happen. There are only two paths to really fix it, and both would cause a lot of short term issues.

    • gigatexal 241 days ago
      Trump is stupid and insane and a coward. But I don’t think he’s stupid enough to put such egregious tariffs up. Maybe someone can explain how the economy works to him like a 5-year-old and maybe he’ll get it.
      • aurareturn 241 days ago
        I think he mentioned the tariffs to try to win the election. I don't think he's dumb enough to raise tariffs on all goods, which would lead to inflation, which would lead to low approval ratings.
        • surgical_fire 241 days ago
          Not to mention that other countries will also raise tariffs against US product and service exports. To presume other countries will accept increased tariffs against their industries without retaliation is naive, to say the least.
  • olliej 241 days ago
    I get that they’re selling huge amounts of hardware atm, but I feel like this is entirely due to the hype train that is BS generators.

    I have not found any of the aggressively promoted use cases to be better than what they replaced, and all the things people seem to choose to use seem of questionable long-term value.

    I can’t help but feel that this nonsense bubble is going to burst and a lot of this value is going to disappear.

    • devjab 241 days ago
      In document recognition they’re going to replace everything that came before. A couple of years ago you needed a couple of ML experts to set up, train and refine models that could parse through things like contracts, budgets, invoices and whatnot to extract key info that needed to be easily available for the business.

      Now you need someone semi-proficient in Python who knows enough about deployment to get a local model running. Or alternatively skills to connect to some form of secure cloud LLM like what Microsoft peddles.

      For us it meant that we could cut the work from 6-12 months to a couple of weeks for the initial deployment. And from months to days for adding new document types. It also meant we need one inexpensive employee for maybe 10% of their total time, where we needed a couple of expensive full-time experts before. We actually didn’t have a problem paying the experts; the real challenge was finding them. It was almost impossible to attract and keep ML talent because they had little interest in staying with you after the initial setups, since refining, retuning and adding new document types is “boring”.
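
      In case it helps picture the setup: a minimal sketch of that kind of extraction against a local model (assuming an OpenAI-compatible server such as llama.cpp's or Ollama's is running locally; the endpoint, model name and fields are illustrative, not our actual stack):

        # Minimal sketch: send a document to a locally hosted, OpenAI-compatible
        # chat endpoint and ask for key fields back as JSON. The URL, model name
        # and fields are assumptions for illustration.
        import json
        import requests

        PROMPT = ("Extract the invoice number, total amount and due date from the "
                  "text below. Answer with JSON only.\n\n{document}")

        def extract_fields(document_text: str) -> dict:
            resp = requests.post(
                "http://localhost:11434/v1/chat/completions",
                json={
                    "model": "llama3",
                    "messages": [{"role": "user",
                                  "content": PROMPT.format(document=document_text)}],
                    "temperature": 0,
                },
                timeout=120,
            )
            resp.raise_for_status()
            return json.loads(resp.json()["choices"][0]["message"]["content"])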

      As far as selling hardware goes I agree with you. Even if they have the opportunity to sell a lot right now, it must be a very risk-filled future. Local models can do quite a lot on very little computation power, and it’s not like a lot of use cases like our document one need to process fast. As long as it can get all our incoming documents done by the next day, maybe even by next week, it’ll be fine.

      • olliej 240 days ago
        I mean I’ve not had any problems with whatever the macOS built-in one is, but document recognition is a tiny part of the industry and isn’t involved in the hype bubble at all.

        The stuff with Python->training->??->$$$ is what I don’t buy:

        First: the “AI” stuff is “generate content with no obvious financial value to anyone”, chat bots (which no one I know actually seems to want), or “maybe better predictions”.

        Second: the “person can do X with AI with less training” etc is not a value of AI, it’s just a product of improved libraries and UI for putting things together. It doesn’t mean the thing they’re doing with AI has any value outside of bandwagoning.

        Third: the reason for AI startups is just that training costs a tonne of capital - and VCs love throwing cash at bandwagons so there’s a pile of “AI” startups, all of which offer essentially the same thing below cost in the hopes that they’ll magically find a profit model.

        Finally: there’s already nearly enough on-device processing power on phones for most actual practical uses of “AI”, so the need for massive GPU rigs will start to tank, especially once the hype train dies off and people start asking what is actually useful in the giant AI startup buzz.

        Each of these things is going to result in the valuation bubble for Nvidia collapsing. Mercifully I don’t think there’s any real harm in the Nvidia valuation bubble (congrats to the folk who did well on their RSUs!), but I still don’t think the valuation has significant longevity.

  • NoZZz 240 days ago
    A pretty ridiculous proposition. The Chinese have just improved their chip fabrication processes, and a GPU is just a retarded, parallelised CPU. Given that most of these innovations are being open sourced, it will be captured by a player that can push down the price.
  • seydor 241 days ago
    Like Tesla but bonkers
    • bitwize 241 days ago
      Need I remind you who runs Tesla? Tesla is already bonkers.
      • imp0cat 241 days ago
        Isn't that a different kind of bonkers though?
  • dr_dshiv 241 days ago
    Is it scary that Huawei is competing with both, very quickly?
    • _zoltan_ 241 days ago
      in what sense?

      there isn't any competitor to the MacBook Pro and the M4.

      there isn't any competitor to NVL based Blackwell racks. not even for the H100/H200.

      so how do you think Huawei competes?

      • dr_dshiv 240 days ago
        They are the biggest high-end competitor to the iPhone in China. They have been shut out of nvidia chips so they are rapidly figuring out how to make their own.

        I’m sure both Apple and nvidia view Huawei as a serious competitor. Like, the kind of company that could, in 5 years, take a third of their market share.

  • PittleyDunkin 241 days ago
    "largest" in terms of market valuation; so, not a very substantial measure.
    • aurareturn 241 days ago
      For public companies, market cap is the most important metric. So it’s the most substantial measure to me.
      • Legend2440 241 days ago
        Market cap is kind of made up. The stock is worth what people believe it to be worth.

        Profit is the number that really matters.

        • jraby3 241 days ago
          Profit and expected future profit (based on growth rate) is one of the ways a (non publicly traded) company is valued.

          Theoretically that's how price targets are created by analysts. So market cap is explainable through that metric.

          Profit this year and no growth vs. the same profit this year and double profits next year should lead to different market caps, and it generally does.

        • aurareturn 241 days ago
          >Market cap is kind of made up.

          For a public company, it's all about return on investment, right? Market cap is the most important factor in return on investment right? Therefore, market cap is the most important measure.

          • diffeomorphism 241 days ago
            > Market cap is the most important factor in return on investment right?

            Completely wrong. Hint: You are on HN. Startups obviously don't start at a large market cap.

            • aurareturn 241 days ago
              I'm not wrong. You just interpreted me wrong.

              I didn't say you should invest in companies already with a big market cap. I said market cap increasing is the ROI to investors (besides dividends, which is mostly irrelevant when investing in tech companies).

              That's exactly how startups work. You invest at a cap of $50m. Its market cap increases to $1 billion. You just made 20x (provided you have liquidity).

          • tjoff 241 days ago
            Oh, I should only invest in companies with the biggest market caps?
            • aurareturn 241 days ago
              Sure, if you think they can increase their market caps by the highest percentage and have enough liquidity for you to exit if you wish.
            • basiccalendar74 241 days ago
              You invest in companies that will have the biggest market cap in the future.
        • YetAnotherNick 241 days ago
          By that logic even dollar value is made up. And profit in dollars too.

          Market cap should be equal to the expected discounted profit.
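
          As a toy sketch of what "expected discounted profit" means (the numbers and the 8% discount rate are made up, nothing Nvidia-specific):

            # Discount each year's expected profit back to today and sum them.
            def discounted_value(expected_profits, rate=0.08):
                return sum(p / (1 + rate) ** t
                           for t, p in enumerate(expected_profits, start=1))

            print(discounted_value([60e9, 70e9, 80e9]))  # ~1.79e11, i.e. roughly $179B for this toy stream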

      • RayVR 241 days ago
        Why is market cap the most important metric? It does not exist in a void. Market cap + P/E or forward P/E, P/B, P/S, etc. are all metrics of high import.

        Market cap is one piece of a complicated picture. On its own, you don't know too much.

        • aurareturn 241 days ago

            Why is market cap the most important metric? It does not exist in a void. Market cap + P/E or forward P/E, P/B, P/S, etc. are all metrics of high import.
          
          I thought it was obvious that I implied market cap increase by percentage.

          When you're evaluating your ROI on tech stocks, the #1 factor is market cap delta between when you bought the shares and now.

        • wbl 241 days ago
          And how much money do you make from trading large cap US equities?
      • itake 241 days ago
        Say there are 100 shares in a company, but only 2 people own them: person1 has 99 shares, person2 has 1 share. person1 sells one share for $100 to person3.

        Does that mean the market cap is $10k? Yes. Is that meaningful if there are no other buyers at $100? No.

        • lIl-IIIl 241 days ago
          Not sure how this applies here. It's a publicly traded company, so the price is literally what buyers are willing to pay.
          • Ekaros 241 days ago
            Also diversified in ownership. Huang is at less than 4%, big institutions are also at less than 10%. So a good chunk of the stock should have some level of liquidity at a reasonable premium.
          • itake 241 days ago
            Just because it’s publicly traded doesn’t change anything.

            There might be no buyers. There might be no buyers willing to buy at the last sold price. There might be no sellers willing to sell at the last sold price.

            Market cap would be $10,000, but if there isn’t a single person willing to buy for that price, then is it worth that much?

            • kortilla 241 days ago
              Of course it changes it. The market cap you see is based on up to date trade clearing prices.
            • HDThoreaun 241 days ago
              Nvidia is liquid; you don't have to worry about the stock having no buyers.
        • aurareturn 241 days ago
          Luckily, Nvidia has no such liquidity issues given that it's the most traded stock in the world.

          https://finance.yahoo.com/markets/stocks/most-active/

          So yes, Nvidia's market cap is meaningful. If the reported market cap is at $3.5t and you want to sell your shares, you can easily find someone else who values Nvidia at $3.5t to buy them.

        • ummonk 241 days ago
          It’s a publicly traded company with very high liquidity (tens of billions being traded each day). The market cap is based on the price that other buyers and sellers are bidding. This hypothetical of there being no other buyers simply doesn’t apply.
  • mythz 241 days ago
    Weirdly enough for my next "AI hardware" purchase, I'm waiting for the release of the M4 Ultra to max out on VRAM since Nvidia's chips are highly overpriced and using GPU RAM to price gouge the market.

    Since the M4 Max allows 128GB RAM, I'm expecting the M4 Ultra to max out at 256GB RAM. Curious what x86's best value is for getting consumer hardware with 256GB of GPU RAM. I've noticed tinygrad offers a 192GB GPU RAM setup for $40K USD [1]; anything cheaper?

    [1] https://tinygrad.org/#tinygrad
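
    For rough sizing, here's a back-of-the-envelope sketch of weight-only memory at a given quantization (it ignores KV cache and runtime overhead, and the parameter counts are just example figures):

      # Weight-only footprint of a model at a given quantization; real usage is
      # higher once KV cache, activations and runtime overhead are added.
      def model_gib(params_billion: float, bits_per_weight: float) -> float:
          return params_billion * 1e9 * bits_per_weight / 8 / 2**30

      for params in (70, 180, 405):
          print(f"{params}B params @ 4-bit ~ {model_gib(params, 4.0):.0f} GiB")
      # 70B ~ 33 GiB, 180B ~ 84 GiB, 405B ~ 189 GiB -- hence the appeal of 192-256GB setups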