Qualcomm Snapdragon X Elite - looks like the Windows world’s answer to Apple Silicon

This is what I am saying: how much more bullshit are you gonna take just because it's "better"? Right now it's onboard memory/SSD; with iPhones it's Apple and their proprietary charging cable and not allowing aftermarket displays (heck, they don't even allow you to swap displays from a donor phone). How much more will it take for peeps to realize we are the ones getting screwed in the end? This is how precedents are set, how the market starts shovelling more bullshit down our throats. How HP gets away with selling proprietary ink and not allowing aftermarket ink in their printers, how you need a subscription to keep using ink you already bought from HP, and this is just one example; I can quote countless others. That's why I say it's copium: just because it's "better" doesn't mean we should entertain the bullshit that OEMs force down our throats.
Aren't you assuming two things here, in the black/white vein of an 80s Bollywood movie, viz:

- Customers are idiots
- Corporations are outright evil

In the example of proprietary ink that you quoted, there are two facets that you are disregarding:

1) Locked inkjets are disproportionately cheap for the customer.
The proprietary-ink model was adopted to create units that could be sold at cost, or even at a loss.
That makes it a positive proposition for households that print infrequently: many such households are OK with a high variable cost per print, but not with a high upfront cost.

(Inkjets are often available for 2.2-2.5K excl. GST. Once you factor in channel costs, shipping and duty, the proceeds for the manufacturer would be to the tune of under 1,000 INR. Inkjets are complex mechanical devices, and the only way they sell this cheap is proprietary ink; a rough sketch of the math is below.)

2) SMBs and affluent households are targeted with ink-tank units or lasers, and HP (like pretty much every other printer brand) makes ink tanks too.
A basic ink tank costs between 8-10K excl. GST, even though there isn't much of a manufacturing cost difference between it and a normal inkjet. Given the product complexity of a printer, that would seem to be a fair price. Yet most households still buy locked inkjets.
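
To put rough numbers on point 1, here's a back-of-envelope sketch; every figure is an assumed illustration, not actual HP data:

```python
# Razor-and-blades economics of a locked inkjet, with assumed illustrative numbers.
street_price = 2400        # INR excl. GST, a typical locked-inkjet street price
channel_margin = 0.35      # assumed retailer + distributor share of the price
shipping_and_duty = 500    # assumed logistics and duty cost per unit, INR

proceeds = street_price * (1 - channel_margin) - shipping_and_duty
print(f"Manufacturer proceeds per printer: ~{proceeds:.0f} INR")  # ~1060 INR

# The thin (or negative) hardware margin is recovered on cartridges:
build_cost = 1800          # assumed cost to build a complex mechanical device
cartridge_profit = 400     # assumed profit per proprietary cartridge, INR
breakeven = (build_cost - proceeds) / cartridge_profit
print(f"Cartridges needed to break even: ~{breakeven:.1f}")       # ~1.9
```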

Barring outright monopolies or oligopolies, I think you will find it hard, if not impossible, to find examples of a company getting away with what you are assuming to be the norm.
 
And TBH, the only person it affects is you.
No, it affects everyone. I am in the same boat: their hardware is good but restricted by their software. If the same hardware were sold by someone else, I could use it, but not while it is only marketed by them.
You get what you pay for, it's as simple as that. Why do you want comparable Android flagships or ARM laptops to be cheaper?
The point is not about how expensive it is; it's about how anti-consumer Apple's practices are. It depends on how much you want to stay in a walled garden. I prefer not being restricted, so I'll never buy an iPhone; other peeps may prefer the walled garden, and the iPhone is ideal for them. It's that simple. You want premium support/updates? Cough up the money for that.
This is bullseye. People use a 10K Android phone and compare it to a 2L iPhone. Use an S23U or S24U and you will throw away the restricted iPhone garbage, unless you are too biased or caught up in buyer's remorse, eternally trying to justify the Apple tax you paid.
 
You get what you pay for, it's as simple as that. Why do you want comparable Android flagships or ARM laptops to be cheaper?
I don't. I paid a lot more for my droid than I paid for my iPhone.
This was just to demonstrate that a product built with an emphasis on quality always costs more.

Although the general perception around here often seems to be that the MacBook/iPhone/iPad/whatever is overpriced, the catch is that this would make pretty much every premium laptop or Android overpriced too.

No, it affects everyone. I am in the same boat: their hardware is good but restricted by their software. If the same hardware were sold by someone else, I could use it, but not while it is only marketed by them.

This is bullseye. People use a 10K Android phone and compare it to a 2L iPhone. Use an S23U or S24U and you will throw away the restricted iPhone garbage, unless you are too biased or caught up in buyer's remorse, eternally trying to justify the Apple tax you paid.
Which brings me back to what triggered this originally: what software restriction, if any, have you faced on macOS?
I could in fact counter-argue that Windows is more restrictive in comparison.
 
Very good. This thread too has turned into a 'Top 10 reasons to hate Apple, number 4 will blow your mind' mansplaining thread.

To those who think people who buy Macs are idiots: we buy Macs or iPhones because we like them and they fit our needs. You have to come out of this mindset that people should only buy products that you approve of.

This thread is about the Snapdragon X Elite; keep it about the Snapdragon X Elite and Windows on ARM.
Okay, I'll nitpick a bit here, though I'll preface this by saying I'm not an expert; I just got curious after a couple of classes. It's not "x86" that runs hot, and it's not "ARM design" that's more efficient.
I have a Dell Latitude 5430 powered by an i5-1254U (work laptop) and an M2 Pro MBP (personal). When I compile on the Latitude in VS or using Maven, it gets bloody hot. Though it has a max TDP of 55W, it runs hot, and though the 5430 is supposed to be the better machine, it runs slower in real-world usage simply because of the heat and throttling. You won't believe it: we barely get 2-3 hours of battery life with these new Intel-based laptops. Before the Latitude, I had a Yoga 370 with a dual-core 7th-gen i7; it never ran hot and had amazing battery life, though it got terribly slow as applications and the OS moved beyond dual-core chips. My M2 Pro MBP runs a lot cooler when I do similar compilations and has much, much better battery life.

After using both platforms extensively, I would love to have Windows on ARM. x86, by its architectural limitations, can never be as efficient as a RISC architecture and can never run in a tightly packed chassis without throttling. I am also saying this because early in my career I was a Linux/Unix admin, and I loved those Sun SPARC servers. Bloody things were insanely efficient (RISC architecture) and would run for months without needing a reboot. The last ones we used were the UltraSPARC III and IV. I could not believe that something running under 1GHz would easily beat x86 server CPUs without breaking a sweat. Back then, Intel was brainwashing everyone that more MHz means more speed.

They are just targeting different niches. Intel and AMD both target the enterprise, where power simply doesn't matter as much. ARM has been targeting mobile devices from the start.
Battery life matters a lot. We spend a lot of time collaborating and attend a lot of meetings, and now we see more and more engineers and managers carrying power cables into meeting rooms, with not enough ports for everyone. This need to carry power cables around never existed in the past. If I could take my MBP to work (not allowed), I would not have to charge it till I came back home. My current Wintel laptop needs to be charged twice (I reach the office by 8:30 and leave around 4) if I do not keep it plugged in.

There are so many people in my office who do not want to upgrade to these new Intel-based laptops simply because they have horrendous battery life for productivity. They want to stick to their 5-year-old laptops and replace the battery pack if required, but as we are not allowed to use laptops that are out of warranty, we are forced to upgrade. We cannot go for those Evo ultrabooks either: they have amazing battery life but are inferior at compiling code, running a browser with dozens of tabs, etc. This is where ARM comes in as a saviour: you get powerful laptops that are not battery guzzlers.
 
I agree on all points. Battery life matters a lot to me too, and is why I'm holding out on purchases; I will directly buy an ARM one when they become available. I just wanted to point out that x86 could also be a lot more power efficient, it's just that Intel and AMD are not that interested in making it so. Sure, it can't be as efficient as a RISC processor, but it can come close: the modern x86 instruction set is still CISC, but the microarchitecture underneath is much closer to RISC, since CISC instructions get decoded into RISC-like micro-ops internally. That's why I said it was a nitpick and not anything you should take as an example; after all, the theory is wildly different from what we actually observe.

By the way, I've gotten the chance to work on PowerPCs, and I loved them too! Not because of the CPU instruction set or anything, since I mostly don't work at that low a level, but just because of the novelty of it and how open all of it is!

In any case, I think we're going very off-topic, and I'm sorry about that, but I love talking about these things. It's fun, and I always get something wrong lol, which others then teach me about. I love that! :D
 
It's not the on-board memory, it's the unified arch which has common memory for RAM/GPU; it's basically the same thing as a PS5 or Xbox. Aside from gaming and maybe video editing, you won't feel a difference in daily usage. Get any Windows laptop with a decent processor and an SSD and you won't feel any difference unless you literally start measuring loading times with a stopwatch.
No, it is not that simple. Unified memory is not just the same memory for the CPU and GPU. From what I remember, in a Mac:
  • The SSD and memory controllers are on the SoC.
  • The memory is in the same package as the SoC, right next to the CPU and GPU cores.
  • The SSD and SoC are connected via high-bandwidth channels.
  • As the SSD can deliver data near-instantaneously, the OS need not keep a large amount of data in memory. This is why Apple says that 8GB on a Mac is like 16GB on a PC (I do hate that 8GB Macs are sold for 1L or more).
Does it help in real-world performance? It did work really well when Apple started the migration to M1: application load times, compilation and rendering were so much faster. Apple's Mx chips showed a clear difference in speed in real-world usage. It's a win either way for users. Those who don't need a lot of power get really long battery life, and the laptop can be ridiculously thin and lightweight. Those who want a lot of power get that power, and not at the cost of battery life. Those who want an extraordinary amount of power can still go with Wintels, but those laptops need 300W or more to operate at that level and the battery won't even last an hour.

Coming back to unified memory. Usually, RISC needs more memory to operate, as each instruction does less work, so more instructions have to be fetched per unit of work. Because memory has to be loaded more often, I believe Apple came up with this unified memory architecture and with ways to reduce the time taken to load data into memory from the SSD, and then from memory to the CPU cores. This is just my thought.
 
Qualcomm has just gone official with both the X Elite and X Plus. This is turning out to be a killer SoC. I can't wait for the actual laptops; the Surface 10 has already leaked, and more should be coming out soon. Intel and AMD are probably crying right now.

 
Apple's 2TB SSD on their top model tops out at 6,000 MB/s or 6 GB/s.

The DDR5-6000 dual-channel RAM on my desktop easily hits 70+ GB/s.

That is almost 12x faster in just sequential throughput, and probably much more for random reads/writes. Not to mention that RAM latencies are measured in nanoseconds, while an NVMe SSD's latency is measured in microseconds, so the SSD is at least a factor of 1000 slower.

All this is to say, even a very fast SSD is not going to make much of a difference when you run out of system memory.
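
Putting both posts' numbers side by side makes the gap concrete; a quick sketch (the bandwidth figures come from the posts above, the latency figures are assumed ballparks):

```python
# Quick back-of-envelope on the RAM-vs-SSD gap.
ram_bw_gbs = 70           # measured DDR5-6000 dual-channel bandwidth (from the post)
ssd_bw_gbs = 6            # Apple's advertised peak for the 2TB SSD (from the post)

ram_latency_ns = 80       # typical DDR5 access latency (assumed ballpark)
ssd_latency_ns = 80_000   # ~80 microseconds, typical fast-NVMe read (assumed ballpark)

print(f"Sequential bandwidth ratio: {ram_bw_gbs / ssd_bw_gbs:.1f}x")          # ~11.7x
print(f"Latency ratio: {ssd_latency_ns / ram_latency_ns:.0f}x")               # ~1000x

# So paging even a few GB back in from the SSD is felt immediately:
print(f"Best case to swap 4 GB back in: {4 / ssd_bw_gbs:.2f} s")              # ~0.67 s
```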
 
This statement is true and has been proven many times by various YouTubers. A Mac slows down hard when it runs out of RAM; it's just that the majority of Mac users don't do memory-intensive tasks, so nobody really notices it.
 
It's not just about the battery. It will now enable manufacturers to make fanless, ventless designs, just like a Mac. To me, the battery life plus the fanless design is more than enough reason to be an early adopter.
Fanless design makes the M2-onwards MacBook Air throttle; the Pro still has a fan. Not sure if systems with the X Elite will be fanless; maybe with the X Plus.
 
Even I hate an 8GB Mac. It is preposterous to even think of 8GB of memory in today's laptops. I just mentioned that it is due to this architecture that Apple is able to twist the narrative. Here is my M2 Pro MBP's memory usage: even if you take out 4GB of cache, I am at 8GB when I am just browsing and reading. There is no way I would say 8GB is good.

Instead of derailing the topic into 'why Apple is doing what it is doing' again and again, I suggest you folks stick to the topic.

[screenshot: M2 Pro MBP memory usage]

Fanless design makes the M2-onwards MacBook Air throttle; the Pro still has a fan. Not sure if systems with the X Elite will be fanless; maybe with the X Plus.
The M2 does not throttle; the M3 does for sure. I have the iPhone 15 Pro Max, which is powered by the A17 Pro; both the M3 and the A17 Pro are made on the same 3nm process. When the iPhone 15 Pro/Max came out, a lot of phones were overheating, and Apple released a patch to reduce peak performance. The current thermal issues seem to be down to the not-yet-mature 3nm fabrication; TSMC had a lot of challenges bringing 3nm into production, and yield was very low.
 
Sure.
  • As the SSD can deliver data near-instantaneously, the OS need not keep a large amount of data in memory. This is why Apple says that 8GB on a Mac is like 16GB on a PC (I do hate that 8GB Macs are sold for 1L or more).
But this part of your post is entirely false, and it can lead people astray when purchasing a laptop, Mac or not. That's why I corrected it.


It looks like there are some spicy rumors going around.


I don't see why Qualcomm would be incentivized to lie about performance when, on day one of launch, plenty of people will run benchmarks on it and call them out. And while I can see them missing performance targets by some fraction, 50% lower or 'Celeron level' is just an unrealistic claim. So I would take this with a large grain of salt.
 
Here (video): the fanless M2 throttles on sustained load.
 
Stoked for these chips to superpower Windows finally!

Prize for anyone who can accurately count the number of "AI"s in the segment. :D

Also, they tried very hard to copy Apple's presentation style, but WTF are they wearing? That weird wrinkled shirt-jacket-sneaker combo looks doubly bad and amateur with a wide-angle lens.
Man, Apple has nailed these presentations, even though they look very clinical now; the BTS work is crazy.
 
Here (video): the fanless M2 throttles on sustained load.
Please stop falling for 'this CPU throttles because it cannot handle multiple consecutive Cinebench runs'. That, my friend, is clickbait. The same will happen with the Snapdragon X Plus. If it works without a hiccup, reviewers will go on and on about how good it is for video rendering and boast ridiculously high Cinebench scores. Then they will keep running Cinebench and say 'hey, this chip is throttling'. When you see that kind of fire-and-sad-face thumbnail about a chip that no one is actually having a problem with, run. Because it is CLICKBAIT!
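
For reference, the 'consecutive runs' test behind those thumbnails is trivial to reproduce yourself; a minimal sketch, with a toy loop standing in for a Cinebench pass (the workload and run count are placeholders):

```python
import time

def workload(n: int = 2_000_000) -> float:
    """Placeholder CPU-bound task standing in for one Cinebench pass."""
    return sum(i * i for i in range(n))

# Run the same workload back to back; a sustained drop in the score
# (passes per second) across runs is what reviewers call throttling.
scores = []
for run in range(8):
    start = time.perf_counter()
    workload()
    elapsed = time.perf_counter() - start
    scores.append(1 / elapsed)
    print(f"Run {run + 1}: score {scores[-1]:.2f}")

drop = (scores[0] - min(scores)) / scores[0] * 100
print(f"Worst-case drop vs first run: {drop:.0f}%")
```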

In the real world, the M2 does not throttle. This is what throttling actually looks like, and one of the prime reasons Apple switched to ARM: the new-generation 2018 MBP ran slower than the previous-gen laptop at the real-world tasks it was intended for (rendering in Premiere Pro):

[video embed]
Come into the real world and the M2 does not have any problem at all. The M3 Air does heat up in the real world, but is the M3 Air running slower than the M2 Air? No. Apple bit off more than they could chew this time with the M3 Air: they went after 'the M3 Air is 1234234322434% faster than the M2 Air' even though that was unnecessary. They wanted to show a drastic increase in performance with the M3 (like M1 -> M2, like Intel -> M1), and that backfired a bit.

The real problem though is this:

[screenshot: excerpt from an M3 MacBook Air review]


This is from a review of the M3 MacBook Air. I have seen so many people saying that the M3 Air can sail through ray-tracing workloads, blah blah. The M3 Air can do all the stuff mentioned in the third paragraph, but it is not meant for it. Many reviewers/influencers give the wrong advice, and people end up buying the wrong laptop for their work. The M2 MacBook Air never throttles in real-world usage; it throttles only when you put it to tasks it is not intended for.

MacBook Air: home use, ideal business laptop, simple audio/video editing. The Air is not meant for heavy workloads that keep the CPU at max.
MacBook Pro: ideal productivity laptop. Anything that keeps your Air at 100% CPU for extended periods of time, run it on a Pro.
Mac Pro: heavy workloads that run for a really long time. Anything that keeps your Pro at 100% CPU for extended periods of time, run it on a Mac Pro.

PS: I am not going to discuss this nitpicking any further.
Stoked for these chips to superpower Windows finally!

Prize for anyone who can accurately count the number of "AI"s in the segment. :D

Also, they tried very hard to copy Apple's presentation style, but WTF are they wearing? That weird wrinkled shirt-jacket-sneaker combo looks doubly bad and amateur with a wide-angle lens.
Man, Apple has nailed these presentations, even though they look very clinical now; the BTS work is crazy.
I am super excited as well; cannot wait to see them in the real world. Now that you mention the presentation style, I cannot unsee it! It does look ridiculous; Kedar looks like a hobbit in wide angle. Is it too early to say 'RIP Intel'?

This one is absolutely misleading though. That Intel chip can hit 115W.
[screenshot: Qualcomm comparison slide vs an Intel chip]
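
A toy calculation shows why the power cap matters on slides like that one; the performance curve and every number below are assumptions for illustration, not either vendor's real data:

```python
# Toy perf/W comparison, illustrating why "matches peak performance at a
# fraction of the power" claims depend entirely on where the competitor's
# chip is allowed to sit on its power curve. All numbers are assumptions.

def perf(power_w: float, scale: float = 1000.0, exponent: float = 0.5) -> float:
    """Assumed diminishing-returns curve: performance ~ sqrt(power)."""
    return scale * power_w ** exponent

capped = perf(28)    # same chip capped at an assumed 28W base power
boosted = perf(115)  # same chip allowed to boost to 115W
print(f"28W:  {capped:.0f} pts, {capped / 28:.0f} pts/W")
print(f"115W: {boosted:.0f} pts, {boosted / 115:.0f} pts/W")
# Perf/W looks far better at the cap and peak score looks far worse;
# a vendor can pick whichever operating point flatters its slide.
```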

This is going to be the defining factor though:
[screenshots: two more Qualcomm slides]


This is a pretty decent start from MS and covers most needs of home users. Hope app devs take the platform seriously this time and make native apps.

[screenshot]
 
The reviewer literally says it doesn't happen for daily tasks. He just compared the M1 vs M2 Air, and the M2 clearly scores less than the M1 after sustained loads. If a person is gaming on Windows there should definitely be fans; for productivity and light coding it should be fine.
 
[screenshot: Cinebench run scores from the video]


This is from the same video: the M2 after run 8 of Cinebench is faster than the M1 at run 1. I am not sure what made you say that the M2 'clearly' scores less than the M1. Now, can we please get back to the Snapdragon X Elite? I would suggest you go to an Apple Store and try out the Air, and try out the Pro as well. You will find zero difference in normal usage. Even on the Pro, the fan won't turn on; I have been browsing and replying for the past hour or so, and even on the Pro the fan has not kicked in. But the moment I start using DaVinci Resolve, the fan kicks in. That is something I would not run on an Air. Hope you get my point.
 
The X Elite looks promising!

I wonder what AMD/Intel will launch next to compete. They'll definitely compete on the more efficient end, as can be seen from the Steam Deck and similar devices. Intel's Core Ultra looks pretty good too. I am so excited to see how this pans out.
However, I think the market will be really turbulent over the next few years if you're looking to buy a new device :P

Will the X Elite also come on ITX mobos, like the N100 does? Or will PC users only get Intel/AMD for the foreseeable future? I guess Qualcomm will try to come for PC users too; I wonder how they'll react. Perhaps a hybrid approach where the SoC+mobo already has some amount of "unified memory" plus DDR5 external memory support? Perhaps NUCs that are priced more fairly, in smaller, more compact bodies? Imagine Lenovo M70-series sizes with the same I/O but so much more performance. :D

I hope they introduce some lower-cost, lower-power chips too, like they did with the Snapdragon 700 series after some years. And I hope they don't mess up the naming scheme while doing so lol.

How will it be priced? I suppose if it costs too much, I'd probably stay with AMD, given that my current laptop works really well, 8h of battery is "good enough" for me, and it runs mostly fanless too.
 
I doubt any SI/OEM will bring out such mini PCs till the platform matures (2-3 years at least). A lot is hinging on Microsoft right now; if they mess it up again, ARM on Windows is dead forever.
 
Guess I am stating the obvious, but the majority of influencer reviews are thought through right from the inception stage to invite the maximum number of clicks, rather than with any kind of altruistic intent to help others make the right decision.

A video in 2024 titled 'M2 works very well for most users' would probably attract a tenth of the clicks of the rather enticing title this one has :tearsofjoy:

@jayanttyagi - I still have the OG M1 Air, which will hit the 4-year mark later this year.
After the Air, I got a 16” M1 Pro as a family PC that runs docked, and my wife got a 14” M3 Pro a few months ago.

To the point that others have also made, they all perform pretty much the same for non-professional usage. Other than the screen grade and form factor, it would be hard to tell between the three under normal (albeit heavy) usage of multiple Outlook/Word/Excel/Safari tabs (say 30+).
Obviously I (or anyone) wouldn't and shouldn't use an ultra-thin-and-light like the M1 Air (or M2) for the kind of workloads Cinebench tests for; that's just common sense.

A negative review this late in a product's lifecycle is actually a good sign for a prospective buyer, and I hope the same happens with the X Elite :smile:
On a side note, I think many of the older folks here miss the old Web 2.0 days. SM (and now gen AI) are truly killing the internet.
 