Backend

System design of the Tergite backend

Flows

Tergite backend flows

Before arrival

The Tergite SDK is used by the experimentalist to specify the quantum circuits or pulses to run on the quantum computer, and it sends these instructions in OpenPulse format to the Tergite backend.
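For illustration, below is a minimal Qiskit circuit of the kind an experimentalist would build; with the Tergite SDK installed, such a circuit is submitted through a Qiskit-style backend and serialized to OpenPulse before being sent on. The submission calls are shown only as comments because the exact provider handle comes from the SDK's own documentation.

```python
from qiskit import QuantumCircuit

# A Bell-state circuit of the kind the experimentalist specifies with the SDK.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# Submission sketch (hypothetical handle; see the Tergite SDK docs for the
# actual provider call):
# backend = <Tergite backend obtained from the Tergite SDK provider>
# job = backend.run(qc, shots=1024)
# counts = job.result().get_counts()
```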

On arrival

The Tergite backend receives the experiments as an HDF5 file sent to its RESTful API. It passes the file over to the quantum jobs queue for processing and execution.
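A minimal sketch of that submission step, assuming a locally running backend; the base URL, endpoint path and form field name are illustrative (in practice the SDK performs this request for you):

```python
import requests

BASE_URL = "http://localhost:8000"   # assumed address of the Tergite backend
JOB_FILE = "experiment.hdf5"         # HDF5 file produced by the SDK

# POST the experiment file to the backend's RESTful API.
with open(JOB_FILE, "rb") as f:
    response = requests.post(f"{BASE_URL}/jobs", files={"upload_file": f})

response.raise_for_status()
print(response.json())               # e.g. an acknowledgement containing the job id
```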

In the quantum jobs queue

Job Registration worker

To keep track of the experiment(s) while in the queue, this worker registers them in the Tergite backend's database. They are then passed on to the next worker: the preprocessing worker.
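A rough sketch of this registration step, assuming a MongoDB document store and an rq/Redis job queue; the collection name, queue name and worker function path are illustrative rather than the backend's actual identifiers:

```python
from datetime import datetime, timezone

from pymongo import MongoClient
from redis import Redis
from rq import Queue

# Assumed storage and queue; real connection details come from configuration.
jobs_collection = MongoClient("mongodb://localhost:27017")["tergite"]["jobs"]
preprocessing_queue = Queue("preprocessing", connection=Redis())


def register_job(job_id: str, hdf5_path: str) -> None:
    """Record the incoming experiment, then hand it to the next worker."""
    jobs_collection.insert_one(
        {
            "job_id": job_id,
            "file": hdf5_path,
            "status": "registered",
            "received_at": datetime.now(timezone.utc),
        }
    )
    # Pass the job on to the preprocessing worker via the queue
    # ("workers.preprocess_job" is a hypothetical worker function path).
    preprocessing_queue.enqueue("workers.preprocess_job", job_id)
```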

Preprocessing worker

Currently, this worker does nothing of significance. The experiment(s) are then passed on to the execution worker.

Execution worker

The experiment(s) are converted from OpenPulse into the native format that the quantum computer's control instruments understand, and then forwarded to those instruments for execution.

The results are then passed on to the postprocessing worker.
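Schematically, the execution worker is a dispatch step: translate the OpenPulse payload into the native format of the configured target, run it, and hand the raw data onwards. The sketch below uses placeholder translator and runner callables; the real ones are the transpilers described under "Special Techniques" and the instrument drivers.

```python
from typing import Any, Callable, Dict

Translator = Callable[[Dict[str, Any]], Any]
Runner = Callable[[Any], Any]


def execute(job: Dict[str, Any], translate: Translator, run: Runner) -> Any:
    """Convert an OpenPulse job to the native format and run it."""
    native = translate(job)  # OpenPulse -> native instrument format
    return run(native)       # raw measurement data for the postprocessing worker


# Dummy stand-ins just to make the sketch runnable; the real callables talk to
# the Qblox instruments or the qiskit-dynamics simulator.
raw = execute({"pulses": []}, translate=lambda job: job, run=lambda native: {"memory": []})
print(raw)
```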

Postprocessing worker

If the final results are expected to be discriminated, this worker discriminates them. Discrimination is the transformation of raw measurement values into zeros and ones (and twos).
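As an illustration, discrimination can be done by assigning each raw IQ measurement point to its nearest calibrated state centroid. The centroids below are made-up numbers; in practice they come from qubit calibration data, and the backend's actual discriminator may differ.

```python
import numpy as np

# Illustrative complex IQ centroids for |0>, |1> (and |2>).
centroids = np.array([
    -1.0 - 0.5j,   # |0>
     1.0 + 0.5j,   # |1>
     0.0 + 1.5j,   # |2>
])


def discriminate(iq_values: np.ndarray) -> np.ndarray:
    """Map raw complex IQ values to the index of the nearest state centroid."""
    distances = np.abs(iq_values[:, None] - centroids[None, :])
    return distances.argmin(axis=1)


shots = np.array([-0.9 - 0.4j, 1.1 + 0.6j, 0.1 + 1.4j])
print(discriminate(shots))   # -> [0 1 2]
```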

The worker saves the results in the database and then sends them to the public API app, from which the experimentalist can query them.

Special Techniques

Transpiling OpenPulse

The real control devices currently in use are produced by Qblox. However, the simulators (COMING SOON) are built on top of Qiskit Dynamics.

There are therefore two types of native instrument targets to transpile OpenPulse to.

Real Device (Qblox)

Qblox instruments can be controlled using the quantify Python packages produced by Qblox itself. Transpilation is therefore from OpenPulse to quantify's Schedule.
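While the transpiler itself is still to be documented (see the TODO below), the target format can be illustrated: a quantify Schedule that resets a qubit, plays a drive pulse and measures it. The port, clock and pulse parameters are illustrative values, not the ones produced by the real transpiler.

```python
from quantify_scheduler import Schedule
from quantify_scheduler.operations.gate_library import Measure, Reset
from quantify_scheduler.operations.pulse_library import SquarePulse

# Target of the OpenPulse transpilation: a quantify Schedule.
sched = Schedule("openpulse_equivalent", repetitions=1024)
sched.add(Reset("q0"))
sched.add(SquarePulse(amp=0.2, duration=20e-9, port="q0:mw", clock="q0.01"))
sched.add(Measure("q0"))
```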

TODO

Simulator (Qiskit Dynamics)

TODO
