Where Does Quantum Computing Play Into Everything?

<p>I have heard that quantum computing is still mostly being developed by physics majors. Where do electrical engineers fit into this realm? What about computer science majors?</p>

<p>The field is still largely theoretical, and probably won’t be taught readily in CS or ECE. I know that there are some researchers at MIT and Stanford working on ion traps as a method for computing on an atomic scale.</p>


<p>I disagree with Presidont’s points.</p>

<p>I think the most forward-looking IT programs will soon start to place a greater emphasis on quantum computing skills - not a complete emphasis, as it takes a while for new technologies, even revolutionary ones, to make their way into standard curricula. I’m quite certain that QC will be here sooner rather than later. There are already numerous approaches to QC out there, and several models have been built, though they are still rudimentary. That’s exactly how computers started out with tubes: they were bulky and extremely basic. But advancements just 20 years later produced computers that could do far more complex computations - basic dialogue systems (ELIZA, for example), Russian-English translation, business data management, etc. (this was in the 60s). Then transistors came along, and since then Moore’s Law has pretty consistently predicted the advancements made in computing for decades.</p>

<p>But Moore’s Law is supposedly only going to apply, given the current paradigm of computing, until about 2020. At that point we’ll have reached atomic-level structures for computing. The only way that Moore’s Law can continue to apply is if we develop technologies that fall under the ‘steep part’ of an exponential curve, where there’s runaway computational power. And that’s exactly what QC is: atomic-level computing that creates runaway computational power. Recent major discoveries in memristors further suggest that computing is moving to the atomic level. Development of ballistic deflection transistor models and other terahertz processing models has begun to hit that atomic boundary too.</p>

<p>Further, Ray Kurzweil’s predictions based on the “law of accelerating returns,” which has a long history of evidence in support, also indicate that QC will be here sooner than we think.</p>

<p>So the forward-thinking schools like Stanford and MIT will undoubtedly be placing a greater emphasis on developing systems and algorithms to harness the power of QC. CMU has already implemented curricular changes that emphasize parallel computation (which has grown significantly in recent times and is well-acknowledged to be the future of computers), and that’s the sort of change that will be necessary in placing more emphasis on QC: creating systems and software that will take advantage of the immense parallel computation that QC harnesses - <em>exponential</em> parallelism.</p>
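<p>To make the “exponential parallelism” point concrete, here is a small sketch (my own illustration, not from the thread): describing n qubits takes 2^n complex amplitudes, whereas an n-bit classical register holds just one of its 2^n possible values at a time.</p>

```python
def classical_states(n_bits: int) -> int:
    """Number of distinct values an n-bit classical register can take."""
    return 2 ** n_bits

def quantum_amplitudes(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe an n-qubit state."""
    return 2 ** n_qubits

# A classical register holds ONE of these values at a time; a quantum
# register's state is a weighted combination over ALL of them at once.
print(quantum_amplitudes(14))   # 16384 (the 14-qubit record discussed below)
print(quantum_amplitudes(128))  # about 3.4e38 (D-Wave's claimed qubit count)
```

<p>The numbers grow identically; the difference is what a register of that size can represent simultaneously, which is why each additional qubit is such a big deal.</p>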

<p>Some say that QC is only useful for X, Y, and Z problems and doesn’t do much better in other areas of computation, but those people have this myopic view that QC will come along and tackle the same problems in the same way that we’ve implemented them for decades. But that’s the whole point of a revolutionary change in computing: you reformulate how you tackle everything, including the problems themselves. It’s what happened when transistors came along. And we’re long overdue for another revolutionary change in how we view computation.</p>

<p>So to answer your questions, electrical engineers will be instrumental in physically implementing the discoveries made in quantum research. They will be necessary to create the actual systems that pipeline the quantum mechanisms to the broader computing system (integrating it with memory, circuits, visual displays, and the like).</p>

<p>Computer scientists are just as important, in that they will create the operating systems and software that make use of this hardware. As a small example, hyper-threading was a great advancement in technology under the current paradigm of transistors, but it requires software that can actually manipulate and take advantage of this technology (in fact, you need to disable the hyper-threading feature for OS’s that can’t handle it). More than anything, QC will revolutionize how we learn and use computer science, especially the algorithms we devise currently, in order to fit this new paradigm of QC.</p>
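<p>The hyper-threading point - that hardware parallelism only pays off when software is written to use it - can be shown with an everyday sketch (my own illustration, unrelated to QC; note that CPython’s GIL means this CPU-bound workload won’t actually run faster in threads - the point is structural, not a benchmark):</p>

```python
import concurrent.futures
import os

def work(n: int) -> int:
    """Stand-in CPU-bound task."""
    return sum(i * i for i in range(n))

jobs = [100_000] * 8

# Serial version: ignores whatever extra hardware threads the CPU exposes.
serial = [work(n) for n in jobs]

# Parallel version: explicitly written to fan work out across the logical
# CPUs the OS reports (which include hyper-threaded "virtual" cores).
with concurrent.futures.ThreadPoolExecutor(max_workers=os.cpu_count()) as ex:
    parallel = list(ex.map(work, jobs))

assert serial == parallel  # identical results; only the scheduling differs
```

<p>The serial version runs correctly on a hyper-threaded chip but never exploits it; only the second version was written to take advantage of the hardware - the same gap, magnified enormously, is what QC software would face.</p>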

<p>One thing is for sure: matter is perfectly capable of representing and manipulating data. At the convergence of chemistry and physics, we will be able to use single atoms to accomplish this task of data processing (and thus are born nano-computers). These advancements are on the horizon. We need these physical scientists to understand the properties and mechanisms of matter in order for us to take advantage of the laws that govern their operation. Once they do, we need electrical engineers to create the systems that will make their practical use possible. And we will need computer scientists to design the right algorithms to take advantage of these systems.</p>

<p>(Obviously, I’m something of a QC advocate, but I’m not a futurist in any sense of the word; I just call it as I see it.)</p>

<p>Well…phantasmagoric basically said everything I was going to say, plus more.</p>

<p>Understanding quantum computing is not like doing 1+1. The basic concepts are not difficult; it’s just like learning anything new. But scientists and mathematicians have been spending their lifetimes trying to get more out of quantum mechanics. We still have a very limited understanding of it.</p>

<p>As a matter of fact, D-Wave sold its first commercial quantum computer to a company for 10 million dollars, I think. It’s ground-breaking because previously quantum computers were owned only by governments (so they were customised, not commercially available). </p>

<p>On the other hand, I am not so much into quantum computing. It is a promising technology for the distant future. From what I have read, controlling the states is extremely difficult, as a state can be disturbed by anything in its surroundings, and once a state is disturbed you lose track of it. </p>

<p>Quantum computing is an interdisciplinary field (I mean, come on, which field isn’t interdisciplinary?).</p>

<p>I’m a bit wary of <a href="http://en.wikipedia.org/wiki/D-Wave_Systems">D-Wave’s</a> quantum computer, in which they claim they have a 128-qubit processor. After all, physicists only recently reached a <a href="http://progressivelever.com/2011/04/05/physicists-entangle-a-record-breaking-14-quantum-bits/">record of 14 qubits</a> in an entangled state. If you read the Wiki article, one professor notes that they never even proved the workings of this machine.</p>

<p>So one question: do quantum computers count in trinaries? Well, everyone says they have 3 states: true, false, and both. I know it is not as simple as that, but it kind of blows my mind. How can a particle be observed without changing its state? I thought it would be knocked out of place when another particle collides with it.</p>
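<p>On the “trinary” question, a common toy model (this sketch is my own, not anything from the thread) treats a qubit not as three discrete states but as two complex amplitudes; a measurement yields 0 or 1 with probabilities given by the squared magnitudes, and it destroys the superposition rather than leaving a third value behind:</p>

```python
import random

def measure(a: complex, b: complex) -> int:
    """Simulate measuring the qubit state a|0> + b|1>, with |a|^2 + |b|^2 = 1."""
    return 0 if random.random() < abs(a) ** 2 else 1

# Equal superposition: amplitude 1/sqrt(2) on each basis state, so a
# measurement gives 0 or 1 with 50% probability each -- never a third value.
amp = 2 ** -0.5
counts = [0, 0]
for _ in range(10_000):
    counts[measure(amp, amp)] += 1
print(counts)  # roughly [5000, 5000]
```

<p>So the “both” state is real before measurement, but you never read it out directly - you only ever see 0 or 1, which is part of why writing quantum algorithms is so different from classical programming.</p>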

<p>I’m also wary of D-Wave being a “true” quantum computer, not just because lots of people in the know doubt it, but because they supposedly created a 128-qubit machine. That would be astronomical in the first place, which would make me doubt it; and given the exponential nature of qubits, 128 qubits would give you such computational power that tasks run on it should perform far better than they actually do. Also, if it really were a 128-qubit QC, there would be much, much more talk about it.</p>

<p>But at least we’re on our way; a 14-qubit QC alone would give you great computational power, and we’ve shown it’s possible. Sure, decoherence is a big problem, and quantum is still little understood, but at least we’ve understood enough to implement rudimentary systems.</p>


<p>I don’t really know much about quantum computers, but maybe this’ll help? From <a href="http://en.wikipedia.org/wiki/Quantum_computer">Wikipedia</a>:</p>


<p>You’ve identified the two biggest problems in QC. For the latter, we need to manage decoherence, which is caused when the environment messes with the qubits. For the former, there are different approaches, but I think the basic idea is to entangle two atoms and then to observe one of them (preserving the other), thus being able to manipulate data without destroying it. I’m not a physics person, so my knowledge of its intricacies is limited, but as far as I’ve read, these are the two biggest problems in QC.</p>
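<p>A toy illustration of the entanglement correlation described above (my own sketch; real entanglement involves much more than this classical-looking correlation): in the Bell state (|00> + |11>)/√2, only the outcomes 00 and 11 have nonzero amplitude, so measuring one qubit always agrees with the other.</p>

```python
import random

def measure_bell_pair():
    """Sample a joint measurement of the Bell state (|00> + |11>)/sqrt(2)."""
    amplitudes = {(0, 0): 2 ** -0.5, (1, 1): 2 ** -0.5}  # 01 and 10 have amplitude 0
    outcomes = list(amplitudes)
    weights = [abs(a) ** 2 for a in amplitudes.values()]
    return random.choices(outcomes, weights=weights)[0]

for _ in range(5):
    q0, q1 = measure_bell_pair()
    assert q0 == q1  # the two measurement results always agree
    print(q0, q1)
```

<p>Neither outcome is fixed in advance, yet reading one qubit tells you the other - that correlation is the resource the “observe one, preserve the other” idea leans on.</p>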

<p>Me neither. I don’t think D-Wave actually sold a real quantum computer. If it had, it would have upended Intel and AMD.</p>


<p>From Wikipedia: <a href="http://en.wikipedia.org/wiki/Quantum_annealing">Quantum annealing - Wikipedia, the free encyclopedia</a></p>

<p>Basically, a simulation. </p>

<p>From D-Wave’s released document:</p>


<p>When I say ground-breaking, I am talking about the commercial part.</p>


<p>davidthefat, because of this quote I’d take what phantasmagoric says with a grain of salt. It’s easy to be super optimistic about technology when you don’t understand how it works.</p>


<p>What? Hyper-threading isn’t a great advancement in technology, and strictly speaking it has nothing to do with transistors. It’s a feature at the computer-architecture level. An Intel processor with hyper-threading has hardware support for multiple threads. Roughly speaking, this means that when your computer runs multiple programs, the processor has hardware features that allow the programs to share the processor.</p>

<p>[quote]Obviously, I’m something of a QC advocate, but I’m not a futurist in any sense of the word; I just call it as I see it.[/quote]</p>

<p>When you cite Ray Kurzweil and write walls of text predicting the groundbreaking impact of quantum computers without actually being a ‘physics person’, I’d say that you are a futurist.</p>


<p>I’m not “super optimistic” about it - even those who are doing research in the field, or who aren’t but know how it works quite well (being physicists), agree with me. You also don’t need to know physics deeply to understand the advancement. For example, most of the computer scientists who write quantum algorithms know very little about physics. Indeed, it’s well-known that you don’t need to know any physics to be able to write quantum algorithms. Of course, that’s different from helping advance the hardware itself, but it’s entirely possible to understand the hardware to a great extent without knowing the dirty details that only QC researchers will actually know (I doubt you even understand those details).</p>

<p>If you look at the advancements in QC, you’ll see that the number of qubits that researchers have implemented since 1998 has steadily doubled every few years, though with less consistency than Moore’s Law. It’s not unreasonable to be optimistic about QC.</p>

<p>You say to take my comments with a “grain of salt” because my knowledge of the intricacies is limited (how that precludes me from providing accurate statements on QC on the whole is a mystery, to me and to QC researchers alike), but you fail to show how my answers to davidthefat’s questions - about the role of computer scientists and electrical engineers in QC - are wrong. Are you saying that they have no role in QC? That they won’t be important in implementing QC beyond the laboratory?</p>


<p>We can disagree all day about how “great” an advancement things like hyper-threading are, but in the end you’re picking at semantics about what “great” means (many would say “this is great” while others say it isn’t). And I didn’t say it had to do with transistors, just that it’s “under the current paradigm of transistors”: you don’t see hyper-threading in QC or vacuum tubes, do you? The rest of what you say here isn’t contradicting my point, which is that you need software to be able to take advantage of the hardware.</p>

<p>Notice that I said “as a small example” - I wasn’t intending for hyper-threading to be a perfect mirror of the issue at hand, just a demonstration that new hardware advances require advances in software.</p>


<p>Notice that I only mentioned Ray Kurzweil in passing, because I don’t agree with most of what he says. Most of what I wrote also doesn’t talk about the “groundbreaking impact” - please re-read my post, which has comparatively little gushing about the impact of QC. I’m not a futurist because I don’t predict groundbreaking technology across every area (another reason why citing Ray Kurzweil doesn’t support the notion of me being a futurist), which, if you weren’t aware, is what futurists do. Rather, I see one area of research that I think is extremely promising and likely to be the future of computing. Many, if not most, authorities on the subject agree, but that doesn’t mean they’re futurists. Picking out some research and saying “this will be very important” does not a futurist make. If it did, virtually everyone in STEM fields would be a futurist, because they can all point to one area of research that they’re reasonably confident will yield significant advances and be revolutionary to their field (or society).</p>

<p>I disagree with the notion that computer scientists and electrical engineers in industry will be doing anything with quantum computing in the next 10 years, or even 20. One of the most important reasons is that engineering means making a product, making it profitable, and making it efficiently. That is going to take a really long time, so there won’t be many opportunities to work in QC until the industry has grown a bit. Remember, all of this comes after physicists figure out the best way to actually build a quantum computer, which makes the timeline even longer. That said, it’ll be a quickly growing market, so it’s worth being optimistic about.</p>

<p>I’m assuming this post refers to the average computer scientist and electrical engineer with a B.S. only. Currently only CS and physics Ph.D.s are working on quantum computing, and it’ll be a very long time before anyone with just a B.S. will be able to work on them. I suspect they’ll exist only in the research realm for another few decades, so if the question is meant to ask, ‘When can I expect to work as an EE on quantum computing in industry after I get my B.S.?’, then I’d say you’d better look at some other options, because that won’t be around for a really long time.</p>

<p>^ electrical engineers already do work on QC. Remember that there are a huge variety of different approaches, and various rudimentary systems have been implemented already, from teams composed mostly of physicists but also computer scientists and electrical engineers. You’re right, though, that someone with a B.S. isn’t likely working on it now, and even eventually PhDs in those areas will be the ones doing most of the work in QC. It’s hard to say just how far off these sorts of budding technologies are; sometimes they burgeon quickly (like the transistor), others take forever and still aren’t great (like AI). IMO it’s somewhere between 10 and 20 years that QC will really take off.</p>

<p>Right. I was talking about EE and CS degree holders with only a B.S. (I did mention those with Ph.D.s).</p>


<p>But the dirty details are what matters. Correct me if I’m wrong, but it isn’t obvious whether useful machines can be built yet. That is the bottleneck in quantum computing. So the people who know a lot about building these machines really are the only people who can gauge progress made on quantum computing well.</p>


<p>Reading your post more carefully, I guess your answer to his question is correct should it be discovered soon that useful quantum computers can be made. There wasn’t much reasoning why this should be the case though. You indicated that there is a need for a new technology for continued progress in computing, and you appealed to a prediction that some guy, who holds religious views regarding technology, made about technological progress in general.</p>


<p>Correct me if I’m wrong, but the number of qubits in these very rudimentary computers is on the order of 1-10 now, right? Isn’t it a little silly to call this exponential growth?</p>

<p>My old roommate jokes about this with his chest hairs. He went from 1 to 2 chest hairs in 18 months and from 2 to 4 in the next 18 months.</p>


<p>What I was trying to point out is that saying that ‘hyper-threading is under the current paradigm of transistors’ doesn’t mean a lot. Saying that sounds to me like you don’t know what you are talking about.</p>


<p>You were predicting that these systems would be successful and in general enough use to warrant inclusion in electrical engineering and computer science curricula.</p>


<p>You should probably make an argument why quantum computing will be successful without appealing to Ray Kurzweil’s hope for accelerated progress in all technological areas then, if you don’t want to sound like a futurist.</p>


<p>How are you coming up with this number?</p>


<p>They have been built, and the people who know a lot about building them have gauged the progress and believe that it’s on the horizon. How do you not know anything about this? Have you even looked up QC and what researchers continually discuss?</p>

<p>At least now I can see just how much you know about progress in QC.</p>


<p>His question is about the role of computer scientists and electrical engineers. Their role is not dependent on time; it’s not as though it would differ if a scalable QC were built tomorrow versus in 20 years. Their role is the same. The OP did not ask whether he personally would be relevant to it, or anything of the sort, but how CS and EE people play into it.</p>


<p>So tell me why I’m wrong and why it is that CS and EE people are not relevant. You still haven’t shown that.</p>


<p>Again: he was mentioned in passing, not as an argument, not as a legitimate point, but a short point in case the OP was familiar with Kurzweil or would like to read what he has to say. So you can stop reading so much into it.</p>


<p>You really have no idea where QC is now, do you? You can’t even be bothered to look up what the latest QC has been (instead just guessing “it’s between 1 and 10, somewhere in there”). And please point out where I said it was exponential growth. I never said anything of the sort.</p>

<p>Rather, I implied it’s starting to show something like Moore’s Law, though with less consistency: it’s hard to gauge this sort of progress when a field is in its infancy, but it’s worth noting that every few years (though not every 18 months), the number of qubits has doubled. In 12 years, we’ve gone from 1-2 qubits to 14.</p>
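<p>For what it’s worth, that doubling claim can be sanity-checked with quick arithmetic (my own back-of-the-envelope numbers, using only the figures already in this thread):</p>

```python
import math

# Going from 2 entangled qubits to 14 is a factor of 7, i.e. log2(7) ~ 2.8
# doublings; spread over the ~12 years mentioned above, that's a doubling
# period of roughly 4.3 years - slower than Moore's Law's ~2 years.
start_qubits, end_qubits, years = 2, 14, 12
doublings = math.log2(end_qubits / start_qubits)
doubling_period = years / doublings
print(round(doublings, 2), round(doubling_period, 1))  # 2.81 4.3
```

<p>So “doubling every few years, with less consistency than Moore’s Law” is roughly consistent with the 2-to-14 figures, for whatever a two-data-point trend is worth.</p>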


<p>Again, you read so much into a small, peripheral comment and then draw the conclusions that you want to draw (in other words, a confirmation bias). Since I was showing contrasts between the development of QC and the developments made under transistors, I was intentionally being explicit about which one the development falls under, to show the parallel.</p>

<p>Believe me, I know how it works (I’m finishing in CS at Stanford now and starting my PhD in CS at MIT in the fall). You are obviously bent on dismissing me as knowing less than you, as you’ve focused on peripheral points, misinterpreted others, etc. despite the fact that you clearly haven’t bothered to learn anything about progress in QC. I trust that you can read about it, and next time, do so before challenging points you know little about.</p>

<p>Researchers even admit that a fully functional quantum computer, if we can get one, would have a limited scope of applications - more specifically, those that require huge amounts of calculation, such as encryption and research projects.
Since we’re only at 14 qubits now, it would take decades more to get to a more useful number, and we can’t even keep coherence long enough to make calculations with 14 qubits, so I doubt it will become a major part of EE/CS anytime soon. (By the way, I found a free version of the 14-qubit paper if you’re interested: <a href="http://arxiv.org/PS_cache/arxiv/pdf/1009/1009.6126v2.pdf">http://arxiv.org/PS_cache/arxiv/pdf/1009/1009.6126v2.pdf</a>)</p>


<p>Yikes, sorry that I was off by four. I do know enough about quantum computing to know that being off by four is more important than you’d think, but I don’t know enough about quantum computing to look at that and say something significant about that progress towards creating a useful machine.</p>

<p>I don’t know that much about quantum computing, but what I’ve heard is that it is predicted to be utilized on specialized machines that solve a select number of problems. The technology won’t go into general-purpose programmable computers. If this is true, then there won’t be that much design and manufacturing of these systems.</p>

<p>If I’m wrong about this, I’d be interested in hearing why.</p>


<p>To me it sounds like you are splitting hairs here, but maybe I’m just unable to appreciate the nuance. Sorry I misunderstood you. But I think that’s a pretty weak reason to be optimistic about quantum computing. Electronic switching circuits are an enormously successful technology, and hoping that every new technology experiences the same growth (which is all you are appealing to here) isn’t a strong reason, in my opinion.</p>


<p>This doesn’t mean that you are knowledgeable about quantum computing. I am interested in what you have to say if you plan on studying the subject and/or have studied it, though.</p>

<p>I’m also curious if you have anything to back this prediction:</p>
