Personally I find quantum computers really impressive, and they haven’t been given the hype they deserve.
I know they won’t be something everyone has in their house, but they will greatly improve some services.
Trough of disillusionment
You think we’ve made it that far?
Approaching the point of disillusionment.
They started to work, but hardly anyone cares. They are still far from being good, or affordable.
I think we’re still headed up the peak of inflated expectations. Quantum computing may be better at a category of problems that do a significant amount of math on a small amount of data. Traditional computing is likely to stay better at anything that requires a large amount of input data, or a large amount of output data, or only uses a small amount of math to transform the inputs to the outputs.
Anything you do with SQL, spreadsheets, images, music and video, and basically anything involved in rendering is pretty much untouchable. On the other hand, a limited number of use cases (cryptography, cryptocurrencies, maybe even AI/ML) might be much cheaper and faster with a quantum computer. There are possible military applications, so countries with big militaries are spending until they know whether that’s a weakness or not. If it turns out they can’t do any of the things that looked possible from the expectation peak, the whole industry will fizzle.
As for my opinion, comparing QC to early silicon computers is very misleading, because early computers improved by becoming way smaller. QC is far closer to the minimum possible size already, so there won’t be a comparable “then grow the circuit size by a factor of ten million” step. I think they probably can’t do anything world-shaking.
As for my opinion, comparing QC to early silicon computers is very misleading, because early computers improved by becoming way smaller. QC is far closer to the minimum possible size already, so there won’t be a comparable
Thanks for saying this. I see a lot of people who assume all technology always gets better all the time. Truth is, things do have limits, and sometimes things hit a dead end and never get better than they are. Those things tend to get stuck in the lab and you never hear about them.
sometimes things hit a dead end and never get better
Ah, that’s when it’s time to start charging a monthly subscription fee of course!
Pretty much on the blue line. They cost a lot of money for being barely functional, and it’s not clear whether they’ll ever be anything more
Quantum computers don’t lie: it’s not like they can run generative AI.
I think AI is falling into disillusionment and Quantum Computers feel at least 10 years behind.
AI is falling into disillusionment for like the 10th time now. We just keep redefining what AI is to mean “whatever is slightly out of reach for modern computers”.
Hahaha, I kept saying this to myself while going through this thread. I mean there is a whole wiki page on the concept of AI winters because it’s such a common occurrence - https://en.m.wikipedia.org/wiki/AI_winter
Amazing computational speedups if you regularly use any of these incredibly specific algorithms. Otherwise useless.
Quantum as a service may exist as a business.
Uh… one of those algorithms in your list is literally for speeding up linear algebra. Do you think just because it sounds technical it’s “businessy”? All modern technology is technical; that’s what technology is. It would be like someone saying, “GPUs would be useless to regular people because all they mainly do is speed up matrix multiplication. Who cares about that except for businesses?” Many of the algorithms there offer potential speedups for linear algebra operations. That is the basis of both graphics and AI. One of the algorithms in that list is even for machine learning. There are various algorithms in the literature for potentially speeding up matrix multiplication. It’s huge for regular consumers… assuming the technology could ever progress enough to reach regular consumers.
literally for speeding up linear algebra
For a sparse matrix where you don’t need the values of the solution vector.
I.e. a very specific use case.
Quantum computers will be called from libraries that apply very specific subroutines for very specific problems.
Consumers may occasionally call a quantum subroutine in a cloud environment. I very much doubt we will have a quantum chip in our phone.
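To make the “library calls a quantum subroutine” idea concrete, here’s a rough Python sketch. Nothing in it is a real API: the quantum branch is just a placeholder, and the size and sparsity cutoffs are invented purely to show how narrow the case is where an HHL-style solver could even apply.

```python
# Hypothetical sketch of a library that dispatches to a quantum backend.
# There is no real quantum_cloud package; the quantum branch below only
# illustrates the shape of the call, it is not an actual API.
import numpy as np

def solve_linear_system(A, b):
    """Solve Ax = b, going quantum only in the narrow case where an
    HHL-style solver could plausibly pay off: a huge, sparse system where
    you only need some property of x, not the full solution vector."""
    n = A.shape[0]
    sparsity = np.count_nonzero(A) / A.size
    if n > 1_000_000 and sparsity < 1e-4:
        # e.g. quantum_cloud.estimate_expectation(A, b, observable=M)
        raise NotImplementedError("the quantum path is hypothetical")
    # Everything else stays on the ordinary classical path.
    return np.linalg.solve(A, b)

# A small dense system falls straight through to the classical solver.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(solve_linear_system(A, b))   # -> [0.09090909 0.63636364]
```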
Yes, but quantum TPM or TPU chips would allow for far more complex encryption. So you’d likely have a portion of the SoC with a quantum bus or some other function.
However, you’re correct that it’d take a sea change in computing for a qubit-based OS.
Strong, post-quantum encryption doesn’t require quantum computers. It uses different mathematical objects (e.g. matrices).
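For example, here’s a toy learning-with-errors sketch in plain NumPy, the sort of lattice/matrix arithmetic that real post-quantum schemes like ML-KEM build on. The parameters are invented and far too small to be secure; the point is only that none of it needs quantum hardware.

```python
# Toy learning-with-errors (LWE) encryption of a single bit. Parameters are
# tiny and NOT secure; real schemes use structured variants with carefully
# chosen parameters. Everything here is ordinary modular matrix arithmetic.
import numpy as np

rng = np.random.default_rng(0)
n, m, q = 32, 128, 3329            # dimension, samples, modulus (toy values)

# Key generation: secret s, public matrix A and noisy products b = A*s + e.
s = rng.integers(0, q, n)
A = rng.integers(0, q, (m, n))
e = rng.integers(-2, 3, m)         # small error term
b = (A @ s + e) % q

def encrypt(bit):
    r = rng.integers(0, 2, m)                 # random 0/1 selection vector
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q          # embed the bit at q/2
    return u, v

def decrypt(u, v):
    d = (v - u @ s) % q                       # equals r.e + bit*q/2 (mod q)
    return int(q // 4 < d < 3 * q // 4)       # close to q/2 means bit was 1

for bit in (0, 1):
    assert decrypt(*encrypt(bit)) == bit
print("toy LWE round-trips both bits")
```

Real schemes add structure (polynomial rings) for efficiency, but the flavour is the same: matrices, small noise, modular arithmetic, all on classical hardware.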
True. However, there is still a use case. You could sign a cert for UEFI, much like a payment would be signed. Useful for distributed compute.
Why are you isolating a single algorithm? There are tons of them that speed up various aspects of linear algebra, not just that single one, and there have been many improvements to these algorithms since they were first introduced. There are a lot more in the literature than in the popular consciousness.
The point is not that it will speed up every major calculation, but these are calculations that could be made use of, and there will likely even be more similar algorithms discovered if quantum computers are more commonplace. There is a whole branch of research called quantum machine learning that is centered solely around figuring out how to make use of these algorithms to provide performance benefits for machine learning algorithms.
If they would offer speed benefits, then why wouldn’t you want to have the chip that offers the speed benefits in your phone? Of course, in practical terms, we likely will not have this due to the difficulty and expense of quantum chips, and the fact they currently have to be cooled to near zero degrees Kelvin. But your argument suggests that if somehow consumers could have access to technology in their phone that would offer performance benefits to their software that they wouldn’t want it.
That just makes no sense to me. The issue is not that quantum computers could not offer performance benefits in theory. The issue is more about whether or not the theory can be implemented in practical engineering terms, as well as a cost-to-performance ratio. The engineering would have to be good enough to both bring the price down and make the performance benefits high enough to make it worth it.
It is the same with GPUs. A GPU can only speed up certain problems, and it would thus be even more inefficient to try and force every calculation through the GPU. You have libraries that only call the GPU when it is needed for certain calculations. This ends up offering major performance benefits and if the price of the GPU is low enough and the performance benefits high enough to match what the consumers want, they will buy it. We also have separate AI chips now as well which are making their way into some phones. While there’s no reason at the current moment to believe we will see quantum technology shrunk small and cheap enough to show up in consumer phones, if hypothetically that was the case, I don’t see why consumers wouldn’t want it.
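Roughly this pattern (CuPy is a real NumPy-compatible GPU library, but the size threshold below is an arbitrary placeholder rather than a tuned value):

```python
# Minimal sketch of the "only call the GPU when it pays off" pattern.
import numpy as np

try:
    import cupy as cp                  # GPU path, if CuPy is installed
    HAS_GPU = True
except ImportError:
    HAS_GPU = False

def matmul(a, b, gpu_threshold=512):
    """Multiply two matrices, offloading to the GPU only for large inputs."""
    if HAS_GPU and min(a.shape[0], a.shape[1], b.shape[1]) >= gpu_threshold:
        # Copy to device, multiply there, copy back. Those transfers are
        # exactly why small problems stay faster on the plain CPU path.
        return cp.asnumpy(cp.asarray(a) @ cp.asarray(b))
    return a @ b

# Small matrices fall straight through to NumPy on the CPU.
x = np.ones((4, 4))
print(matmul(x, x))
```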
I am sure clever software developers would figure out how to make use of them if they were available like that. They likely will not be available like that any time in the near future, if ever, but assuming they are, there would probably be a lot of interesting use cases for them that have not even been thought of yet. They will likely remain something largely used by businesses but in my view it will be mostly because of practical concerns. The benefits of them won’t outweigh the cost anytime soon.
I’m so dreadfully sorry. I cannot help myself. Please forgive me.
It’s “zero kelvins” not “zero degrees Kelvin.”
You don’t have to be sorry; it was stupid of me to write that.
Why are you isolating a single algorithm?
To show that quantum computing only helps with very specific parts of very specific algorithms.
A QC is not a CPU, it’s not a GPU, it’s closer to a superpowered FPU.
If they would offer speed benefits, then why wouldn’t you want to have the chip that offers the speed benefits in your phone?
if somehow consumers could have access to technology in their phone that would offer performance benefits to their software that they wouldn’t want it.
Because the same functionality would be available as a cloud service (like AI now). This reduces costs and the need to carry liquid nitrogen around.
The issue is not that quantum computers could not offer performance benefits in theory.
It is this. QC only enhances some very specific tasks.
It is the same with GPUs. A GPU can only speed up certain problems. You have libraries that only call the GPU when it is needed for certain calculations.
Yes, exactly my point. QC is a less flexible GPU.
I don’t see why consumers wouldn’t want it.
Because they would need to use the specific quantum enhanced algorithms frequently enough to pay to have local, always on access.
They will likely remain something largely used by businesses but in my view it will be mostly because of practical concerns. The benefits of them won’t outweigh the cost anytime soon.
Agree. Unless some magic tech, like room-temperature superconductors, turns up, there will only be quantum as a service supplied for some very specific business needs.
Because the same functionality would be available as a cloud service (like AI now). This reduces costs and the need to carry liquid nitrogen around.
Okay, you are just misrepresenting my argument at this point.
Actually I think we are mostly agreeing.
The difference is that you think the technology will quickly be made cheap and portable enough for mass consumption, while I think it will remain, for quite some time, niche and expensive, like high-end precision industrial equipment.
Don’t forget about quantum tunneling past the activation barrier.
Quantum computers are now where neural nets were in the 1980s.
Good reference to compare with, but any sources?
It’s just an opinion
Difficult thing to guess, as we’re seeing a lot more exponential growth.
The answer for that exists as a superposition of multiple possibilities but as soon as somebody manages to read it it will decohere into just the one.
Somewhere around 0,0 or 1,1
There are amazing possibilities in the theoretical space, but there hasn’t been enough of a breakthrough on how to practically make stable qubits on a scale to create widespread hype
Quantum computers have already had their hype, so plateau of productivity. It’s just that the plateau is really low.
There is a difference between feasibility hype and adoption hype. The hype about it being possible at all has passed. But the true hype relevant to the graph is when it is implemented in the general economy, outside of labs and research facilities.
Yeah they’re similar to fusion. The hype perpetually goes up to the first peak and then back down to the left while they keep working on it
This is the equivalent of saying AI already had its hype because Isaac Asimov was popular.
People are aware of the term quantum computer and basically nothing else. We’re a decade pre-hype at least. Only a small handful of specialists are investing in it.
The picture only shows one hype cycle. AI has been through multiple hype cycles. Same will happen with quantum computers, once a new major breakthrough is reached.
There hasn’t been anything resembling a hype cycle for quantum computing.
All points on that curve, at the same time just now, for undefined values of now.
I personally think we’re on the slope of enlightenment - quantum computing no longer attracts as much hype as it used to, but in the background, there’s a lot of interesting developments that genuinely might be very important.
I’d agree, but that slope will be a long and hard one. And the hype cycle may have many more peaks and troughs of disillusionment, from new breakthroughs, but the researchers will still make steady progress.
If true then when did QC have its “ChatGPT” moment?