You Can't Program Quantum Computers With Python (Part II)

Earlier, I shared Part I of my spicy take that "you can't program quantum computers with Python," focusing on ways to make the popular dichotomy between interpreted and compiled languages more precise and relevant. In particular, I made the claim that what was more relevant than being "compiled" was providing minimal runtime dependencies and strong design-time safety.

At least at the outset, that sounds like it's quite contradictory to how Python works as a language, and yet Python seems to be used quite well to program quantum devices --- what gives? In this part, I'll explore that tension by looking at two more issues. First, there's a difference between using Python as a programming language and using Python as a metaprogramming framework. Second, people mostly aren't really writing quantum programs --- not yet.

Programming and Metaprogramming

Consider a Python snippet like the following:

def f(x):
    return x + 2

This would, at the outset, seem to be perfectly clear: it defines a function f that takes an argument x and adds it to the constant value 2. In that sense, we're using Python as a programming language; that is, as a language to define a computer program for running on a classical computer.

We can do something different, though, with the same kind of function definition, but passing in something other than a number. Let's make a couple new data classes and see what we can do with them:

from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Var:
    name: str

    def __add__(self, other: Expr) -> Expr:
        return PlusExpr(self, other)

    # Needed so that expressions like 2 + x work: Python falls back to
    # __radd__ when the left operand (an int or a float) doesn't know
    # how to add a Var.
    def __radd__(self, other: Expr) -> Expr:
        return PlusExpr(other, self)

@dataclass
class PlusExpr:
    left: Expr
    right: Expr

    def __add__(self, other: Expr) -> Expr:
        return PlusExpr(self, other)

    def __radd__(self, other: Expr) -> Expr:
        return PlusExpr(other, self)

NumberLiteral = int | float
Expr = Var | NumberLiteral | PlusExpr

(For the rest of the post, I'll quote snippets of this example rather than showing the full thing. If you want to see the entire example, check out this gist on GitHub.)

Now, if we call f with an instance of Var, what we get is a kind of description of a program:

>>> x = Var("x")
>>> f(x)
PlusExpr(left=Var(name='x'), right=2)

Critically, when we call f, our program doesn't actually add anything at all, but only generates a description of a program. We could later use that description to actually run our program:

>>> y = 2 * x + 1
>>> evaluate(y, x=3)
7
>>> evaluate(y, x=4)
9
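
I haven't shown the implementation of evaluate here; it lives in the gist linked above. It amounts to nothing more than a recursive walk over the expression tree, though, so a minimal sketch might look like the following (this assumes a TimesExpr class defined analogously to PlusExpr, which the full example includes):

def evaluate(expr: Expr, **bindings: NumberLiteral) -> NumberLiteral:
    # Number literals evaluate to themselves.
    if isinstance(expr, (int, float)):
        return expr
    # Variables look up their values in the bindings we were given.
    if isinstance(expr, Var):
        return bindings[expr.name]
    # Compound expressions recurse on each side, then combine the results.
    if isinstance(expr, PlusExpr):
        return evaluate(expr.left, **bindings) + evaluate(expr.right, **bindings)
    if isinstance(expr, TimesExpr):
        return evaluate(expr.left, **bindings) * evaluate(expr.right, **bindings)
    raise TypeError(f"Don't know how to evaluate {expr!r}.")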

Effectively, what we've done is to build up a Python object representing a program rather than build a program directly. If this sounds familiar, that's the same technique used by TensorFlow and other machine learning libraries for Python to compile expressions to various accelerator backends. Having the complete structure of an algebraic expression as a Python object makes it much easier to target different backends, but it also makes it much easier to manipulate expressions and perform different "passes." You can even do things like take the derivative of an expression:

>>> x = Var("x")
>>> y = Var("y")
>>> z = 2 * x * x + 3 * x * y + (-4) * y
>>> simplify(derivative(z, x))
PlusExpr(left=PlusExpr(left=TimesExpr(left=2, right=Var(name='x')), right=TimesExpr(left=2, right=Var(name='x'))), right=TimesExpr(left=3, right=Var(name='y')))
>>> evaluate(derivative(z, x), x=3, y=4)
24
>>> evaluate(4 * x + 3 * y, x=3, y=4)
24
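
Likewise, derivative is another recursive transformation over the same tree, applying the sum and product rules as it goes. Here's a sketch along the same lines, leaving simplify (which folds constants and drops zero terms) to the gist:

def derivative(expr: Expr, var: Var) -> Expr:
    # Constants don't vary, so their derivative is zero.
    if isinstance(expr, (int, float)):
        return 0
    # d(var)/d(var) = 1; any other variable is held constant.
    if isinstance(expr, Var):
        return 1 if expr == var else 0
    # Sum rule: (u + v)' = u' + v'.
    if isinstance(expr, PlusExpr):
        return PlusExpr(derivative(expr.left, var), derivative(expr.right, var))
    # Product rule: (uv)' = u'v + uv'.
    if isinstance(expr, TimesExpr):
        return PlusExpr(
            TimesExpr(derivative(expr.left, var), expr.right),
            TimesExpr(expr.left, derivative(expr.right, var)),
        )
    raise TypeError(f"Don't know how to differentiate {expr!r}.")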

This only works because z is a Python object that we can manipulate, call methods on, and modify. In general, techniques that work by manipulating and transforming other programs are known as metaprogramming techniques; here, we've used Python's operator overloading as a basic kind of metaprogramming, but many other common techniques, such as templates/generics, macros, and code generation, broadly fall under the same term.

By contrast, if z were declared as a Python function — if we programmed z directly — we'd have limited access to its internal structure and would only be able to run z as a function (this is only mostly true, given that Python includes a disassembler, but that's a far more complicated approach to metaprogramming than the one we're concerned with here).
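
To get a sense of what that more complicated approach looks like, the standard library's dis module will happily show us the bytecode that f compiles to (the exact output varies between Python versions; this one is from 3.10):

>>> import dis
>>> dis.dis(f)
  2           0 LOAD_FAST                0 (x)
              2 LOAD_CONST               1 (2)
              4 BINARY_ADD
              6 RETURN_VALUE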

One common application of metaprogramming is to quickly design new programming languages embedded in some host language. These new languages, sometimes called embedded domain-specific languages or embedded DSLs, borrow syntax from their host, but apply that syntax in distinct enough ways that programming in the embedded DSL can feel quite distinct from programming in the host language.

Suppose, for instance, that we want to generate some HTML programmatically using Python. We could then consider making an HTML-like language that embeds into Python — let's call it PML for Python Markup Language.

>>> html = PmlNodeKind("html")
>>> body = PmlNodeKind("body")
>>> p = PmlNodeKind("p")
>>> a = PmlNodeKind("a")
>>> html(
...     body(
...         p("Hello, world!"),
...         p(
...             "Click ",
...             a("here", href="http://example.com"),
...             " to learn more."
...         )
...     )
... ).to_html()
'<html><body><p>Hello, world!</p><p>Click <a href="http://example.com">here</a> to learn more.</p></body></html>'

Here, function calls no longer mean what they usually mean in Python; rather, we've reused function calls to mean something much more declarative. In particular, p("Hello!") isn't read as an instruction to run a function p with "Hello!" as its argument, but as a declaration adding a new p tag with "Hello!" as its contents. Behind the scenes, the PmlNodeKind class overloads __call__ to implement that declarative meaning, but writing code in PML no longer really feels like Python.
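
It doesn't take much code to pull that trick off, either. Here's one possible implementation of PmlNodeKind, as a minimal sketch (my own, not from a library; a real version would also want to handle escaping):

from dataclasses import dataclass

@dataclass
class PmlNodeKind:
    tag: str

    # Calling a node kind doesn't run anything; it declares a new node.
    def __call__(self, *children, **attributes):
        return PmlNode(self.tag, list(children), attributes)

@dataclass
class PmlNode:
    tag: str
    children: list
    attributes: dict

    def to_html(self) -> str:
        attrs = "".join(
            f' {name}="{value}"' for name, value in self.attributes.items()
        )
        inner = "".join(
            child.to_html() if isinstance(child, PmlNode) else str(child)
            for child in self.children
        )
        return f"<{self.tag}{attrs}>{inner}</{self.tag}>"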

Many quantum libraries use this trick to build up circuits as well — let's look at a quick sample of building a quantum circuit using QuTiP:

import qutip_qip as qp
import qutip_qip.circuit
# Imported for use below, when we export the circuit as OpenQASM.
import qutip_qip.qasm

circ = qp.circuit.QubitCircuit(2)
circ.add_gate("SNOT", 0)     # SNOT is QuTiP's name for the Hadamard gate.
circ.add_gate("CNOT", 1, 0)  # Target qubit 1, controlled on qubit 0.

In the same way that we used our little PML example to generate HTML, QuTiP can generate OpenQASM 2.0 text from a circuit description:

>>> print(qp.qasm.circuit_to_qasm_str(circ))
// QASM 2.0 file generated by QuTiP

OPENQASM 2.0;
include "qelib1.inc";

qreg q[2];

h q[0];
cx q[0],q[1];

Using this kind of approach, we can build up simple circuits like quantum teleportation. This time, let's use Qiskit to give it a try:

import qiskit as qk
from qiskit import ClassicalRegister, QuantumRegister

qubits = QuantumRegister(3)
classical_bits = [ClassicalRegister(1) for _ in range(2)]
circ = qk.QuantumCircuit(qubits, *classical_bits)
# Helper functions defined below; they add the gates that prepare and
# unprepare a Bell pair on the given qubits.
prepare_entangled_state(circ, 1, 2)
unprepare_entangled_state(circ, 0, 1)
circ.measure(0, 0)
circ.measure(1, 1)
circ.z(2).c_if(classical_bits[0], 1)
circ.x(2).c_if(classical_bits[1], 1)
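
The two helper functions above aren't part of Qiskit, but are small utilities for working with Bell pairs; here's one plausible way to write them (these definitions are mine, included so the example is complete):

def prepare_entangled_state(circ, a, b):
    # Prepare qubits a and b into the Bell pair (|00⟩ + |11⟩) / √2.
    circ.h(a)
    circ.cx(a, b)

def unprepare_entangled_state(circ, a, b):
    # Run the preparation in reverse, rotating the Bell basis back to
    # the computational basis before we measure.
    circ.cx(a, b)
    circ.h(a)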

What's going on with that .c_if method? That gets to the heart of the difference between using Python to write quantum programs and using Python to implement an embedded DSL for writing quantum programs. The two approaches look very similar when we're writing quantum circuits, but couldn't be more different when we're writing quantum programs.

Quantum Circuits versus Quantum Programs

Wait, but aren't quantum circuits the same as quantum programs? No, not really — circuits are the special case of nonadaptive quantum programs: quantum programs where the list of quantum instructions to be executed is fixed, and does not depend on the outcomes of quantum measurements. Some circuit representations, such as that used by OpenQASM 2.0, include some small special cases of adaptivity, such as the teleportation example above, but for the most part, circuits are a vanishingly small subset of quantum programs in general. Due to hardware limitations in most prototype devices up to this point, though, circuits are where the vast majority of the effort in programming quantum devices has focused so far.

More generally, quantum circuits are interesting subroutines in larger quantum programs that include lots of control flow, including branching on the results of quantum measurements. To represent that control flow in an embedded DSL built with metaprogramming, we face a challenge: we can't actually rely on the host language for control flow.

If we could, we might expect something like the following to work:

# WARNING: this snippet is not valid!
qubits = QuantumRegister(3)
classical_bits = [ClassicalRegister(1) for _ in range(2)]
circ = qk.QuantumCircuit(qubits, *classical_bits)
prepare_entangled_state(circ, 1, 2)
unprepare_entangled_state(circ, 0, 1)
circ.measure(0, 0)
circ.measure(1, 1)
if classical_bits[0] == 1:
    circ.z(2)
if classical_bits[1] == 1:
    circ.x(2)

Indeed, that's closer to how standalone domain-specific languages (that is, DSLs that aren't embedded into host languages) such as Q# or OpenQASM 3.0 represent conditional quantum operations. In Qiskit and other embedded DSLs, though, the if keyword is taken by the host language, not the embedded language. In the above attempt, what we actually get is a Python program that either generates a quantum program with a z gate acting on qubit 2, or generates a quantum program without that gate. That is, the if statement is resolved when we generate the quantum program, not when we run it.

Instead, Qiskit provides a c_if method that transforms part of a quantum program into a new quantum program that includes a classical condition, similar to how our earlier derivative function transformed one program into another. Other embedded DSLs, such as PyQuil, provide methods such as if_then and while_do to emit if-conditions or while-loops into quantum programs.
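
For instance, a measurement-conditioned gate in PyQuil might look something like the following sketch (hedging a bit here, as the exact API details vary between PyQuil versions):

from pyquil import Program
from pyquil.gates import H, MEASURE, Z

prog = Program()
ro = prog.declare("ro", "BIT", 1)
prog += H(0)
prog += MEASURE(0, ro[0])
# if_then emits a conditional branch into the quantum program itself,
# rather than branching while we build the program.
prog.if_then(ro[0], Program(Z(1)))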

There are some ways around this challenge; for example, QCOR uses Python's built-in disassembler to turn Python code into quantum programs:

@qjit
def qpe(q : qreg):
    ...
    for i in range(bitPrecision):
        for j in range(1<<i):
            oracle.ctrl(q[i], q)

In general, though, implementing an embedded DSL for quantum programming within Python means that Python syntax is reserved for metaprogramming, and you need to come up with new ways of expressing programming constructs like loops and conditions.

Conclusions

With all that given, we now have enough to come back to the original spicy take and cool it down a bit. You can't use Python to write quantum programs, but you absolutely can use Python to write embedded programming languages that you can then use to write quantum programs. That does present some challenges compared to standalone languages like Q# and OpenQASM 3.0, but at the same time, it can be a good path for bringing the power and flexibility of Python into quantum programming as a metaprogramming engine.

Quantum Folks, We Need to Talk About January 6th

Recently, Congress held a hearing on expanding the National Quantum Initiative (NQI) Act. On its own, this doesn't sound all that notable — new research and new technologies means new funding and new budgets, after all. Given the level of hype in quantum computing over the past few years, expanding the NQI isn't all that surprising.

At the same time, go read that first sentence again. Does it jump out at you that this is a discussion playing out in the US Congress, out of all political venues in the United States? Maybe not, as much of the rhythm of politics-as-usual plays out with the banging of gavels in hearing after hearing. Sitting in 2023, however, we are very far from politics-as-usual, so maybe the word "Congress" should jump out at you.

Maybe it is significant that the future of quantum investments by the United States government is being decided in the same halls that recently saw a manufactured debt crisis nearly shut down the entire government. Maybe it's more significant still that the hearing was held in the US House, the same august body currently debating things like how much to restrict abortion healthcare, whether or not to impeach cabinet members for spurious right-wing fever dreams, or something incomprehensible about gas stoves. Maybe it's more significant that the hearing on funding for quantum computing research and development was held in the same halls that, almost 30 months ago, were home to a vote to overturn democracy in the United States. Maybe it's more significant that the hearing was chaired by Representative Frank Lucas, who, on January 6th, sided with insurrectionists by voting to invalidate the election in Arizona.

Maybe it's significant that as Rep. Lucas regurgitated empty rhetoric about China's funding for quantum computing (making sure to get the word "communist" in there for Fox News viewers at home), he stood in the same building where, two and a half years ago, he tried to help a violent mob bring an end to free and fair elections in his own country.

It's tempting to treat quantum computing as some neutral thing, a technology without any moral values of its own. Perhaps it's even true, but that need not imply that researchers, developers, technical writers, project managers, or anyone else in the quantum research community or the quantum industry should act without morals. When someone like Rep. Frank Lucas stands up and tells us that he "cannot overstate the importance of maintaining the U.S. competitive advantage in quantum capabilities," we need to take him seriously, including the full and complete context of whom Rep. Lucas considers to be American, or even human.

Representative Lucas voted against COVID-19 relief, against providing a legal immigration path for DREAMers, against the right of same-sex or mixed-race couples to get married, against the freedom to vote (twice, even!), against healthcare for women and pregnant people more generally, against protections for Native Americans and LGBTQIA+ individuals who have been subjected to domestic violence, and against LGBTQIA+ rights more generally. Most critical of all, however, is his vote on January 6, 2021. As GovTrack.us notes:

Lucas was among the Republican legislators who participated in the attempted coup. On January 6, 2021 in the hours after the violent insurrection at the Capitol, Lucas voted to reject the state-certified election results of Arizona and/or Pennsylvania (states narrowly won by Democrats), which could have changed the outcome of the election. These legislators pumped the lies and preposterous legal arguments about the election that motivated the January 6, 2021 violent insurrection at the Capitol. The January 6, 2021 violent insurrection at the Capitol, led on the front lines by militant white supremacy groups, attempted to prevent President-elect Joe Biden from taking office by disrupting Congress’s count of electors.

Had Rep. Lucas and his effective collaborators scaling the walls outside the Capitol succeeded, we might well be dealing with a second Trump term despite the will of the voters. We can't know for certain what such a term would look like, but both in his time in office and in his social media posts since, Trump himself has given a pretty good clue that a second term would be horrific for queer people, people of color, people with disabilities, women and nonbinary people, and would be especially horrifying for anyone whose identity intersects multiple modes of oppression.

When Rep. Lucas tells us that "quantum computers have vast, untapped potential for both good and evil, which is why it’s so important that we stay ahead of our adversaries on these technologies," maybe we should consider whom Rep. Lucas considers his adversaries to be. At least for myself, I would posit that he made that much clear on January 6, 2021.

Have You Considered Egg?

I promise I'll post part II of my recent spicy quantum computing take soon, but in the meantime, I wanted to share a small bit of surreal flash fiction that I wrote recently. Thanks for reading, and I'll see you (approximately) next week!


I kind of just stared at it for a while. It was sitting there on the plate, right next to the bacon, the hashbrowns, the toast, the coffee, and the orange juice, just like it always did. I used to take it for granted that that's what you did with an egg: stare at it, sitting on the plate in its shell. Day after day, I would stop in at the café on my way to work, order the breakfast plate, and look at the egg as I carefully ate around it, not wanting to disturb it as it sat there, fragile but still unbroken. Day after day, I finished my meal and carefully slid the plate back towards the edge of the booth, put some cash down on the white hand-written bill, and left before any other customers arrived.

Except one day, that wasn't how it went at all.

I looked down at my watch; the flashing display read 8:53am, reminding me that I was currently missing my morning status meeting. I hadn't intended to be late, but once I saw how far the detour took me out of my way, I made sure to text in and offer my apologies. Not that I got a response, of course, but I hoped it was enough to at least buy some time to eat. There's a few routines that you can't really mess with, after all.

The egg just sat there on my plate as always, oblivious to how late it was, and to the din that had picked up in the café. I was far from the only customer for a change, and the whole space seemed to fill with life, accompanied by the bustle and noise that followed life wherever it went. Tentatively, I scooped my fork into the hash, but it didn't taste the same, the simple flavors of oil, starch, and salt mixing in my senses with the heavy aroma of coffee and stale air. It even looked different --- the golden brown shreds on the tarnished fork caught not only the warm yellow fluorescent light overhead, but also the sunlight that peeked in through the layers of adhesive caked on the window, a palimpsest of the different advertisements that had hung there over the past half-century. Less brown, more golden and vibrant, reflecting the fervor around it.

I chewed as best as I could, choking down just how overwhelmed I was. Stolen snippets of conversation intruded into my mind, threatening to pilfer my own thoughts as well. The potatoes were just a touch shy of burnt, as always, but their soft crunch became sharp and unsettling as I ate, even the texture of my breakfast turning against me.

Frustrated, I paused, looking up at the other diners and wishing they would stop with their noise, their smell, their movement --- all taking up more and more space in my brain. This was, or at least was supposed to be, the one moment in my day when I could just be, not have to process so many different senses to simply exist.

The egg just sat there, as always, still oblivious.

In the booth across from me, there was a man in a gray suit, a bit badly fitting and poorly pressed. What social obligation was he performing with such perfunctory and superficial compliance? The egg on my plate didn't know, or if it did, its uniform white shell betrayed no sign of comprehension, any more than its matte texture seemed aware of the sunlight. Everything else on my plate had changed, but the egg --- my egg --- just sat there.

The man in the suit brought a fork to his mouth, covered in something brilliant and yellow, but what? Not hash, not any selection from a fruit cup, not any part of a pie that I could recognize. I thought through the whole menu, recalling the contents of each different dish. Every day I looked at each option before deciding, as always, on my one-egg breakfast. There was nothing for it, though, no other options on the menu. I looked over to confirm, hoping I wasn't too obvious as I noted the bacon, hash browns, and toast on his plate. No doubt about it, process of elimination told me that had to be his egg. Not white, but yellow. Not solid, but oozing down his fork. Not matte, but almost glimmering.

Why was his egg so different than mine? My egg sat there, not offering any answers at all. (Rude.) I picked my egg up, ignoring the bacon to inspect every point on its surface, studying the way it refused to even acknowledge the sunlight that danced across every other thing I could see.

I was deep in my reverie when the waitress came by to refill my coffee. The dark liquid, almost a thin tar in its viscosity, flowed into my mug on its own schedule, lapping slightly at the edges as it settled into the ceramic. The sound startled me, giving the egg --- my egg --- an opportunity to escape. It dropped onto my plate with a soft click, a crack spreading across its surface.

This had never happened before. Eggs weren't supposed to break. But there it was, caring more for what my hard and yellowed plate had to say about matters than for my own need to have something, anything, stay the same.

As I watched, the crack spread further. Something shiny eked its way out of the shell, dripping onto the plate. I bent down to look, and saw

My childhood room, my furniture, my toys, my young body sitting on my carpet, reading one of my books. No, not a book that I knew of, but something different. As a child, I'd been obsessed with what I would eventually understand to be civil engineering, always asking questions about who made roads, who drew the shape of those roads across the landscape, who strung bridges across chasms. The cover of this book was different, full of little blue balls flying about other balls, stylized atoms and molecules. I blinked, and the book was full of animals, then plants, then swords and suits of armor, then ancient columns holding up ancient roofs. I blinked again and saw

My office, my desk, my papers, my computer. Just like the book that wasn't my book, I saw oozing out of the egg a different office, with a typewriter, with a shelf full of brown and green hardcover books, with a dizzying array of green and black circuit boards. I saw

The city skyline from my balcony. Paris, Sydney, Tokyo, and Cairo all spread across my plate as I watched my egg spill its contents over glistening Mediterranean sand, over fields amber and verdant. I saw

My closet, full of clothes not my own. T-shirts that were just "t-shirts," and not "fitted." Practical but uninspiring shoes. A dozen copies of the same slacks. High-visibility vests. Tuxedos. Leather straps. Logo-emblazoned polos. A flapper dress. I saw

My world, but not my world. What my world could have been, what it could still be. The possibilities I had left sitting inside each and every intact egg, day after day. The worlds I had sent back to the kitchen each morning. The lives I had been afraid to let out, to reflect the sunlight, to mix with the smell of coffee, sweat, and hope, to abandon its own shape in favor of the plate or the toast or the hash.

I took a bite.

You Can't Program Quantum Computers With Python (Part I)

I'll warrant that "you can't program quantum computers with Python" is a spicy take on quantum computing, given the prevalence of Python-based quantum computing toolchains — everything from QuTiP through to Qiskit offers Python users a way to write and run quantum programs. At the same time, it's not that hot a take, as the same qualities that prevent using Python to write quantum programs perhaps paradoxically also make Python a great language for quantum computing.

To reconcile those two seemingly opposite claims, we'll need to take a tour through two long-running dichotomies in classical software development: interpreted versus compiled languages, and programming versus metaprogramming. That will necessarily be a bit long for a single post, so let's dive in with a discussion of compiled versus interpreted languages.

Compiled and Interpreted Languages

Generally, when we talk about programming languages, folks tend to separate them into compiled languages like C, C++, Rust, and Go, or interpreted languages like Python and JavaScript. If you ask how to classify languages like C# or Java that use a virtual machine to interpret intermediate-level bytecode at runtime, you'll get a different answer depending on the biases and preferences of whomever you ask. Add just-in-time compilation into the mix, and you'll as often as not get demure mumbles followed by a sudden shift to talking about the weather.

Taxonomy is hard, and fitting all languages into one of two buckets is one of the hardest taxonomical debates we run into in classical software development. So let's approach the problem with overwhelming and embarrassing levels of hubris, and simply solve it: all programming languages are interpreted, whether or not they involve the use of a compiler. The interpreter may be built into your CPU, or might be a complex userspace application, but it exists nonetheless. A sequence of bytes is always meaningless on its own, without reference to a particular device or application interpreting them as instructions.

Rather, I'd posit that when people refer to the division between compiled and interpreted languages, a large part of the confusion stems from the fact that there are actually two distinct technical questions (at least!) being juggled behind the scenes: how complex are the runtime requirements for a language, and what design-time safety does a language provide? Both of these questions are quite tied up in the task of programming quantum computers, to put it mildly.

Runtime Dependencies

When you write in a language like C, you have access to the C Standard Library, a set of functions and data types like fopen for opening files, or strncpy for copying up to n characters of a string from one buffer to another. Except when you don't. That standard library needs to exist at runtime, leading to a variety of different implementations being made available, including glibc, musl, Android's Bionic, and Microsoft's Universal C Runtime. In some cases, such as working with embedded microcontrollers or other low-power devices, those implementations might not make sense, such that you might not have a standard library available at all. As a result, the programs you write in C may have heavier or lighter runtime requirements, depending on what capabilities you assume and what kinds of devices you're trying to work with.

Traditionally, languages that we think of as being interpreted tend to have much heavier runtime requirements than compiled languages. JavaScript programs require either a browser or an engine like NodeJS to run (except when they don't), while Python carries the entire Python interpreter and its rather large standard library as dependencies.

Except when it doesn't. The MicroPython project provides an extremely lightweight implementation of the Python interpreter and an optional compiler, allowing Python to be used on small, low-power microcontrollers. The tradeoff is that, just as programming in C without the standard library is harder than programming with it, the version of Python recognized by MicroPython is a strict subset of the Python most people are used to working with, leading to a number of important differences.

Even MicroPython, though, is likely more than what can be reasonably run on the classical parts of a quantum device, especially when considering the extremely strict latency requirements imposed by coherence times. While there's no fixed set of runtime dependencies associated with a language, the fact that Python is very dynamic makes it difficult to fit its dependencies within the exacting requirements of quantum execution.

What do we mean by dynamic, though? Luckily, there's more to this post!

Design-Time Safety

In my previous post on types and typeclasses, I highlighted the role that types can play in checking the correctness of programs. To use an example from that post, consider passing an invalid input to a Python function that squares its argument:

>>> def square(x):
...     return x * x
...
>>> print(square("circle"))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 2, in square
TypeError: can't multiply sequence by non-int of type 'str'

Generally, if you make a logic error of this form in Python, it gets caught when you run the program. On the other hand, if I try to write the same thing in Rust, I get an error when I compile the program, complete with suggestions as to how to fix it:

$ cat src/main.rs
use num::Num;

fn square<T: Num + Copy>(x: T) -> T {
    x * x
}

fn main() {
    println!("{}", square("hello"));
}
$ cargo build
error[E0277]: the trait bound `&str: Num` is not satisfied
 --> src/main.rs:8:27
  |
8 |     println!("{}", square("hello"));
  |                    ------ ^^^^^^^ the trait `Num` is not implemented for `&str`
  |                    |
  |                    required by a bound introduced by this call
  |
  = help: the following other types implement trait `Num`:
            BigInt
            BigUint
            Complex<T>
            Ratio<T>
            Wrapping<T>
            f32
            f64
            i128
          and 11 others
note: required by a bound in `square`
 --> src/main.rs:3:14
  |
3 | fn square<T: Num + Copy>(x: T) -> T {
  |              ^^^ required by this bound in `square`

For more information about this error, try `rustc --explain E0277`.

That isn't to say that Rust is better, so much as that it provides different tradeoffs. While Python is far more flexible, and requires developers to specify a lot less information up front, that also means that there's less information available to validate programs as we write them, rather than when we run them. The dichotomy isn't as strict as that, of course, thanks to design-time validators for Python like Pylint and Mypy, but it is generally true that languages like Python provide fewer design-time guarantees while languages like Rust provide more.
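
For example, if we annotate square with types, a checker like Mypy can catch the same logic error without ever running the program (the exact message varies from version to version):

def square(x: float) -> float:
    return x * x

square("circle")
# Running Mypy on this file reports something like:
#     error: Argument 1 to "square" has incompatible type "str";
#            expected "float"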

Much of that difference stems from the fact that in Python, types are dynamic, meaning that they are a runtime property of values. We can even check what type something is at runtime and make decisions accordingly:

>>> import random
>>> if random.randrange(2):
...     x = 42
... else:
...     x = "the answer"
...
>>> type(x)
<class 'str'>
>>> print("str" if isinstance(x, str) else "not str")
str

Here again, the dichotomy between dynamic and static type systems is a bit hard to pin down, given that languages like C# and Java support reflection as a way to make runtime decisions about types that would normally be static, and that polymorphism in C++ allows for some amount of dynamism subject to an inheritance bound. Even Rust has the dyn keyword for building a dynamic type out of a static typeclass bound. We can broadly say, though, that Python has a much more dynamic type system than most languages.

The practical effects of that dynamism are far-reaching, but what concerns us in this post is the impact on quantum computing: it's difficult to use types and other design-time programming concepts to make conclusions about what a block of Python code will do when we run it. That works well when computing time is cheaper than developer time, such that we can run and test code to ensure its validity, but works dramatically less well when computing time is expensive, as in the case of a quantum device.

Next time on...

In order to be useful for programming quantum computers, we want a language that is easy to use with minimal to no runtime dependencies, and that provides strong design-time guarantees as to correctness. Put together, exploring these two dichotomies tells us that we want something that looks more like what often gets called a compiled language, even if taking the compiled-vs-interpreted taxonomy literally isn't the most useful.

That then leaves the question as to what that "compiled" language should look like, if not Python. In the next post, I'll try to answer that by arguing that one very good alternative to using Python to program quantum computers is... to use Python.