Quick Start with a dummy cluster

Let’s set up the Tergite stack to run against a dummy cluster on your local machine.

We will not need an actual quantum computer. Note, however, that the dummy cluster only ever returns 0’s in its results.

Prerequisites

You may need to install the following software if you don’t have it already installed:

  • Docker
  • Git
  • MongoDB (and the MongoDB Compass GUI)
  • conda (or Python 3.9+)
  • Redis
  • Visual Studio Code (or any other text editor)

Set up the Frontend

  • Ensure that Docker is installed and running.
docker info
  • Clone the tergite-frontend repository.
git clone https://github.com/tergite/tergite-frontend.git
  • Enter the tergite-frontend folder
cd tergite-frontend
  • Create an mss-config.toml file with Visual Studio Code (or any other text editor).
code mss-config.toml
  • Update the mss-config.toml with the following content
# mss-config.toml

# general configurations
[general]
# the port on which MSS is running
mss_port = 8002
# the port on which the websocket is running
ws_port = 6532
# environment reflects which environment the app is to run in.
environment = "development"
# the host the uvicorn runs on.
# During testing auth on 127.0.0.1, set this to "127.0.0.1". default: "0.0.0.0"
mss_host = "127.0.0.1"

[database]
# configurations for the database
name = "testing"
# database URI
# host.docker.internal resolves to the host's 127.0.0.1
# see https://stackoverflow.com/questions/31324981/how-to-access-host-port-from-docker-container#answer-43541732
url = "mongodb://host.docker.internal:27017"

[[backends]]
name = "loke"
# the URL where this backend is running
# host.docker.internal resolves to the host's 127.0.0.1
# see https://stackoverflow.com/questions/31324981/how-to-access-host-port-from-docker-container#answer-43541732
url = "http://host.docker.internal:8000"

[auth]
# turn auth OFF or ON, default=true
is_enabled = false
cookie_domain = "127.0.0.1"
cookie_name = "tergiteauth"

[[auth.clients]]
name = "github"
client_id = "some-github-obtained-client-id"
client_secret = "some-github-obtained-client-secret"
redirect_url = "http://127.0.0.1:8002/auth/app/github/callback"
client_type = "github"
email_regex = "^(john\\.doe|jane|aggrey)@example\\.com$"
email_domain = "example.com"
roles = ["admin", "user"]

[[auth.clients]]
name = "puhuri"
client_id = "some-puhuri-obtained-client-id"
client_secret = "some-puhuri-obtained-client-secret"
redirect_url = "http://127.0.0.1:8002/auth/app/puhuri/callback"
client_type = "openid"
email_regex = "^(john\\.doe|jane)@example\\.com$"
email_domain = "example.com"
roles = ["user"]
openid_configuration_endpoint = "https://proxy.acc.puhuri.eduteams.org/.well-known/openid-configuration"

# Puhuri synchronization
# Puhuri is a resource management platform for HPC systems that can also be used for quantum computers
[puhuri]
# turn puhuri synchronization OFF or ON, default=true
is_enabled = false
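
If you want to sanity-check this configuration before starting the services, the minimal Python sketch below parses it and confirms the values this quick start relies on. It is optional and not part of the official setup, and it assumes Python 3.11+ for the built-in tomllib module (on older versions the third-party tomli package offers the same API).

# optional: sanity-check mss-config.toml before starting the services
import tomllib  # Python 3.11+; on older versions: pip install tomli, then `import tomli as tomllib`

with open("mss-config.toml", "rb") as f:
    config = tomllib.load(f)

# this quick start expects auth and Puhuri synchronization to be switched off
assert config["auth"]["is_enabled"] is False, "auth should be disabled for the dummy cluster"
assert config["puhuri"]["is_enabled"] is False, "puhuri sync should be disabled for the dummy cluster"

# MSS should listen on port 8002 and know about the 'loke' backend
print("MSS port:", config["general"]["mss_port"])
print("backends:", [backend["name"] for backend in config["backends"]])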
  • Create a .env file with Visual Studio Code (or any other text editor).
code .env
  • Update the .env with the following content
# .env

# required
ENVIRONMENT="development"
MSS_V2_API_URL="http://127.0.0.1:8002/v2"
GRAFANA_LOKI_URL=http://127.0.0.1:3100/loki/api/v1/push
LOKI_LOGGER_ID=some-generic-id

# docker LOGGING_DRIVER can be journald, json-file, local etc.
LOGGING_DRIVER=json-file
# image versions:
# Note: If you ever want the images to be rebuilt,
# you have to change the app version numbers here
# before running "docker compose up"
MSS_VERSION=v0.0.1
DASHBOARD_VERSION=v0.0.1
PROMTAIL_VERSION=2.8.3
  • Open the MongoDB Compass application and connect to the default local MongoDB instance.

  • Create a new MongoDB database called “testing” that contains a “backends” collection.
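
If you prefer doing this from Python instead of the Compass UI, a small sketch with pymongo (an assumption here; install it with pip install pymongo if needed) creates the same database and collection:

# optional: create the "testing" database and "backends" collection without Compass
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["testing"]

# MongoDB creates databases lazily; explicitly creating the collection makes "testing" visible
if "backends" not in db.list_collection_names():
    db.create_collection("backends")

print(client.list_database_names())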

  • Delete any old Docker images of “tergite/tergite-mss” and “tergite/tergite-dashboard” if they exist.
docker rmi tergite/tergite-mss:v0.0.1
docker rmi tergite/tergite-dashboard:v0.0.1
  • To run the services, use the fresh-docker-compose.yml file.
docker compose -f fresh-docker-compose.yml up -d
  • Remove any stale artefacts created during the Docker build.
docker system prune
  • Check that the services are running.
docker compose -f fresh-docker-compose.yml ps
  • To stop the services, run:
docker compose -f fresh-docker-compose.yml stop
  • To stop the services and also remove their containers, run:
docker compose -f fresh-docker-compose.yml down
  • To view the logs of the Docker containers and catch any errors, use:
docker compose -f fresh-docker-compose.yml logs -f

See more at https://docs.docker.com/reference/cli/docker/compose/logs/.

  • Ensure that the services are running. If they are not, restart them.
docker compose -f fresh-docker-compose.yml up -d
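
One optional way to confirm that MSS is answering on port 8002 is the small standard-library check below. It does not assume anything about the endpoints MSS exposes; even an HTTP error status proves that the service is up and responding.

# optional: check that MSS answers on port 8002
import urllib.error
import urllib.request

try:
    with urllib.request.urlopen("http://127.0.0.1:8002", timeout=5) as response:
        print("MSS responded with HTTP status", response.status)
except urllib.error.HTTPError as exc:
    # an HTTP error (e.g. 404) still proves the server is up and answering
    print("MSS responded with HTTP status", exc.code)
except (urllib.error.URLError, OSError) as exc:
    print("MSS does not seem to be reachable:", exc)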

Set up the Backend

  • Ensure you have conda installed. (Alternatively, plain Python 3.9+ will also work.)
  • Ensure the Redis server is running.
redis-server
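Optionally, you can confirm from Python that Redis is answering on its default port 6379. This is a minimal sketch that assumes the redis package is installed (pip install redis):

# optional: check that the local Redis server is reachable
import redis

r = redis.Redis(host="localhost", port=6379)
print("Redis ping:", r.ping())  # prints True if the server answers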
  • Clone the tergite-backend repository.
git clone https://github.com/tergite/tergite-backend.git
  • Create a conda environment and activate it
conda create -n bcc -y python=3.9
conda activate bcc
  • Enter the tergite-backend folder and install the dependencies
cd tergite-backend
pip install -r requirements.txt
  • Create a .env file with Visual Studio Code (or any other text editor).
code .env
  • Update the .env file to have the following content
# .env
APP_SETTINGS=development
IS_AUTH_ENABLED=False

DEFAULT_PREFIX=loke
STORAGE_ROOT=/tmp
LOGFILE_DOWNLOAD_POOL_DIRNAME=logfile_download_pool
LOGFILE_UPLOAD_POOL_DIRNAME=logfile_upload_pool
JOB_UPLOAD_POOL_DIRNAME=job_upload_pool
JOB_PRE_PROC_POOL_DIRNAME=job_preproc_pool
JOB_EXECUTION_POOL_DIRNAME=job_execution_pool

# Main Service Server
MSS_MACHINE_ROOT_URL=http://localhost:8002
MSS_PORT=8002

# Backend Control computer
BCC_MACHINE_ROOT_URL=http://localhost:8000
BCC_PORT=8000

QUANTIFY_CONFIG_FILE=quantify-config.yml
EXECUTOR_TYPE=quantify
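
The ports in this file must line up with the frontend configuration: MSS_PORT matches mss_port = 8002 in mss-config.toml, and BCC_PORT matches the :8000 backend URL configured there. The optional sketch below parses the .env with plain standard-library code and flags obvious mismatches; it is only a convenience check, not part of the official setup.

# optional: consistency check of the backend .env file
from pathlib import Path

env = {}
for line in Path(".env").read_text().splitlines():
    line = line.strip()
    if line and not line.startswith("#") and "=" in line:
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()

# these should agree with the frontend's mss-config.toml (mss_port = 8002, backend URL on :8000)
assert env["MSS_PORT"] == "8002", "MSS_PORT should match mss_port in mss-config.toml"
assert env["BCC_PORT"] == "8000", "BCC_PORT should match the backend URL in mss-config.toml"
assert env["MSS_MACHINE_ROOT_URL"].endswith(":" + env["MSS_PORT"])
assert env["BCC_MACHINE_ROOT_URL"].endswith(":" + env["BCC_PORT"])
print("backend .env looks consistent")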
  • Create a quantify-config.yml file with Visual Studio Code (or any other text editor).
code quantify-config.yml
  • Update the quantify-config.yml with the following content
# quantify-config.yml
clusters:
  - name: clusterA
    instrument_type: Cluster
    is_dummy: true
    ref: internal
    instrument_address: "192.0.2.141"

    modules:
      - name: "clusterA_module7"
        instrument_type: "QCM_RF"

        complex_outputs:
          - name: "complex_output_0"
            lo_freq: 4458000000
            dc_mixer_offset_I: 0
            dc_mixer_offset_Q: 0
            portclock_configs:
            - port: "drive0"
              clock: "d0"
              mixer_amp_ratio: 1
              mixer_phase_error_deg: 0

      - name: "clusterA_module8"
        instrument_type: "QCM_RF"

        complex_outputs:
          - name: "complex_output_0"
            lo_freq: 5110000000
            dc_mixer_offset_I: 0
            dc_mixer_offset_Q: 0
            portclock_configs:
            - port: "drive1"
              clock: "d1"
              mixer_amp_ratio: 1
              mixer_phase_error_deg: 0

      - name: "clusterA_module9"
        instrument_type: "QCM_RF"

        complex_outputs:
          - name: "complex_output_0"
            lo_freq: 4445000000
            dc_mixer_offset_I: 0
            dc_mixer_offset_Q: 0
            portclock_configs:
            - port: "drive2"
              clock: "d2"
              mixer_amp_ratio: 1
              mixer_phase_error_deg: 0


      - name: "clusterA_module17"
        instrument_type: "QRM_RF"

        complex_outputs:
          - name: "complex_output_0"
            lo_freq: 6838000000
            dc_mixer_offset_I: 0
            dc_mixer_offset_Q: 0
            portclock_configs:
            - port: "readout0"
              clock: "m0"
              mixer_amp_ratio: 1
              mixer_phase_error_deg: 0
            - port: "readout1"
              clock: "m1"
              mixer_amp_ratio: 1
              mixer_phase_error_deg: 0
            - port: "readout2"
              clock: "m2"
              mixer_amp_ratio: 1
              mixer_phase_error_deg: 0
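
To double-check that the file parses as valid YAML and that the cluster is really flagged as a dummy, you can load it with PyYAML. This is an optional sketch and assumes PyYAML is available (pip install pyyaml):

# optional: sanity-check quantify-config.yml
import yaml

with open("quantify-config.yml") as f:
    config = yaml.safe_load(f)

for cluster in config["clusters"]:
    # the quick start relies on the dummy cluster, so is_dummy must be true
    assert cluster["is_dummy"] is True, f"{cluster['name']} is not a dummy cluster"
    modules = [(module["name"], module["instrument_type"]) for module in cluster["modules"]]
    print(cluster["name"], "modules:", modules)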
  • Create a backend_config.toml file with Visual Studio Code (or any other text editor).
code backend_config.toml
  • Update the backend_config.toml with the following content.
[general_config]
name = "loke"
description = "Backend for the simulator supporting qiskit-connector parsing for the release 2024.3"
characterized = true
open_pulse = true
version = "2024.04.0"
num_qubits = 5
num_couplers = 8
num_resonators = 5
simulator = true
online_date = "2022-04-13T19:50:31"
dt = 1e-9
dtm = 1e-9 

[device_config]
meas_map = [ [ 0, 1, 2, 3, 4 ] ]
coupling_map = [
  [ 0, 2 ],
  [ 2, 0 ],
  [ 1, 2 ],
  [ 2, 1 ],
  [ 2, 3 ],
  [ 3, 2 ],
  [ 2, 4 ],
  [ 4, 2 ]
]
coordinates = [
      [1, 1],
      [3, 1],
      [1, 2],
      [2, 2],
      [3, 2]
]
discriminators = [ "lda" ]
qubit_ids = [ "q0", "q1", "q2", "q3", "q4" ]
qubit_parameters = [
  "id",
  "x_position",
  "y_position",
  "xy_drive_line",
  "z_drive_line",
  "frequency",
  "pi_pulse_amplitude",
  "pi_pulse_duration",
  "pulse_type",
  "pulse_sigma",
  "t1_decoherence",
  "t2_decoherence"
]
resonator_parameters = [
  "id",
  "x_position",
  "y_position",
  "readout_line",
  "acq_delay",
  "acq_integration_time",
  "frequency",
  "pulse_delay",
  "pulse_duration",
  "pulse_type",
  "pulse_amplitude"
]

coupler_parameters = [
  "id",
  "x_position",
  "y_position",
  "xy_drive_line",
  "z_drive_line",
  "frequency",
  "pi_pulse_amplitude",
  "pi_pulse_duration",
  "pulse_type",
  "pulse_sigma",
  "t1_decoherence",
  "t2_decoherence"
]

[device_config.discriminator_parameters]
lda = [
  "coef_0",
  "coef_1",
  "intercept",
  "score"
]

[gates.u]
qubits = [ 0 ]
qasm_def = "gate id q { U(0, 0, 0) q; }"
parameters = [ ]

[gates.h]
qubits = [ 0 ]
qasm_def = "gate id q { U(0, 0, 0) q; }"
parameters = [ ]

[gates.x]
qubits = [ 0 ]
qasm_def = "gate id q { U(0, 0, 0) q; }"
parameters = [ ]

[simulator_config.units.qubit]
# configs for units
frequency = "Hz"
t1_decoherence = "s"
t2_decoherence = "s"

[simulator_config.units.readout_resonator]
acq_delay = "s"
acq_integration_time = "s"
frequency = "Hz"
pulse_delay = "s"
pulse_duration = "s"

[simulator_config.units.discriminators]

[[simulator_config.qubit]]
frequency = 4_511_480_043.556283
pi_pulse_amplitude = 0.17555712637424228
pi_pulse_duration = 5.6e-8
pulse_sigma = 7e-9 
pulse_type = "Gaussian"
t1_decoherence = 0.000034
t2_decoherence = 0.000033
id = "q0"

[[simulator_config.qubit]]
frequency = 4_677_112_343.360253
pi_pulse_amplitude = 0.17535338530538067
pi_pulse_duration = 5.6e-8
pulse_sigma = 7e-9 
pulse_type = "Gaussian"
t1_decoherence = 0.000034
t2_decoherence = 0.000033
id = "q1"

[[simulator_config.qubit]]
frequency = 5_770_226_599.80365
pi_pulse_amplitude = 0.17873594718151276
pi_pulse_duration = 5.6e-8
pulse_sigma = 7e-9 
pulse_type = "Gaussian"
t1_decoherence = 0.000034
t2_decoherence = 0.000033
id = "q2"

[[simulator_config.qubit]]
frequency = 6_856_217_811.995201
pi_pulse_amplitude = 0.17326197853513559
pi_pulse_duration = 5.6e-8
pulse_sigma = 7e-9 
pulse_type = "Gaussian"
t1_decoherence = 0.000034
t2_decoherence = 0.000033
id = "q3"

[[simulator_config.qubit]]
frequency = 6_701_096_836.557067
pi_pulse_amplitude = 0.16948867103728774
pi_pulse_duration = 5.6e-8
pulse_sigma = 7e-9 
pulse_type = "Gaussian"
t1_decoherence = 0.000034
t2_decoherence = 0.000033
id = "q4"

[[simulator_config.readout_resonator]]
acq_delay = 5e-8 
acq_integration_time = 0.000001
frequency = 7_260_080_000
pulse_delay = 0
pulse_duration = 9e-7 
pulse_type = "Square"
pulse_amplitude = 0.1266499392606423
id = "q0"

[[simulator_config.readout_resonator]]
acq_delay = 5e-8 
acq_integration_time = 0.000001
frequency = 7_380_000_000
pulse_delay = 0
pulse_duration = 9e-7 
pulse_type = "Square"
pulse_amplitude = 0.12660078572926436
id = "q1"

[[simulator_config.readout_resonator]]
acq_delay = 5e-8 
acq_integration_time = 0.000001
frequency = 7_502_000_000
pulse_delay = 0
pulse_duration = 9e-7 
pulse_type = "Square"
pulse_amplitude = 0.08245560237524203
id = "q2"

[[simulator_config.readout_resonator]]
acq_delay = 5e-8 
acq_integration_time = 0.000001
frequency = 7_712_000_000
pulse_delay = 0
pulse_duration = 9e-7 
pulse_type = "Square"
pulse_amplitude = 0.04188729430238
id = "q3"

[[simulator_config.readout_resonator]]
acq_delay = 5e-8 
acq_integration_time = 0.000001
frequency = 7_871_000_000
pulse_delay = 0
pulse_duration = 9e-7 
pulse_type = "Square"
pulse_amplitude = 0.05844291534543274
id = "q4"

[simulator_config.discriminators.lda.q0]
score = 0.985
intercept = -38.4344477840827
coef_0 = -98_953.87504155144
coef_1 = -114_154.48696231026

[simulator_config.discriminators.lda.q1]
score = 0.987
intercept = -42.05181160328822
coef_0 = -107_941.00358803963
coef_1 = -124_239.32054386326

[simulator_config.discriminators.lda.q2]
score = 0.9905
intercept = -22.684588212281916
coef_0 = -191_087.42493249022
coef_1 = -20_803.06874845618

[simulator_config.discriminators.lda.q3]
score = 0.8735
intercept = -1.933795064413808
coef_0 = -29_474.17108465108
coef_1 = 78_360.1067777809

[simulator_config.discriminators.lda.q4]
score = 0.96375
intercept = -6.6282190356967075
coef_0 = -106_998.96952984166
coef_1 = 66_774.10489889105
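
Since the file is long, an optional consistency check can save a debugging round trip: the sketch below verifies that the qubit count, coupling map, and simulator entries agree with each other. As before, it assumes tomllib on Python 3.11+ (or the third-party tomli package on older versions).

# optional: consistency check of backend_config.toml
import tomllib  # Python 3.11+; on older versions: pip install tomli, then `import tomli as tomllib`

with open("backend_config.toml", "rb") as f:
    config = tomllib.load(f)

general = config["general_config"]
device = config["device_config"]
simulator = config["simulator_config"]

# the qubit ids, simulator qubits and readout resonators should all match the declared counts
assert general["num_qubits"] == len(device["qubit_ids"])
assert general["num_qubits"] == len(simulator["qubit"])
assert general["num_resonators"] == len(simulator["readout_resonator"])

# the coupling map lists every coupling in both directions
pairs = {tuple(pair) for pair in device["coupling_map"]}
assert all((b, a) in pairs for a, b in pairs), "coupling_map is not symmetric"

print("backend_config.toml looks consistent for", general["name"])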
  • Run the start script
./start_bcc.sh

Run an Experiment

  • Open another terminal

  • Create a new folder “tergite-test” and enter it

mkdir tergite-test
cd tergite-test
  • Create a conda environment, activate it, and install the qiskit and tergite packages
conda create -n terg -y python=3.9
conda activate terg
pip install qiskit
pip install tergite
  • Create a file main.py with Visual Studio Code (or any other text editor).
code main.py
  • Update the main.py file with the following content:
# main.py
"""A sample script doing a very simple quantum operation"""
import time

import qiskit.circuit as circuit
import qiskit.compiler as compiler

from tergite.qiskit.providers import Job, Tergite
from tergite.qiskit.providers.provider_account import ProviderAccount

if __name__ == "__main__":
    # the Tergite API URL
    API_URL = "http://localhost:8002"
    # The name of the Quantum Computer to use from the available quantum computers
    BACKEND_NAME = "loke"
    # the name of this service. For your own bookkeeping.
    SERVICE_NAME = "local"
    # the timeout in seconds for how long to keep checking for results
    POLL_TIMEOUT = 100

    # create the Qiskit circuit
    qc = circuit.QuantumCircuit(1)
    qc.x(0)
    qc.h(0)
    qc.measure_all()

    # create a provider
    # provider account creation can be skipped in case you already saved
    # your provider account to the `~/.qiskit/tergiterc` file.
    # See below how that is done.
    account = ProviderAccount(service_name=SERVICE_NAME, url=API_URL)
    provider = Tergite.use_provider_account(account)
    # to save this account to the `~/.qiskit/tergiterc` file, add the `save=True`
    # provider = Tergite.use_provider_account(account, save=True)

    # Get the Tergite backend in case you skipped provider account creation
    # provider = Tergite.get_provider(service_name=SERVICE_NAME)
    backend = provider.get_backend(BACKEND_NAME)
    backend.set_options(shots=1024)

    # compile the circuit
    tc = compiler.transpile(qc, backend=backend)

    # run the circuit
    job: Job = backend.run(tc, meas_level=2, meas_return="single")

    # view the results
    elapsed_time = 0
    result = None
    while result is None:
        if elapsed_time > POLL_TIMEOUT:
            raise TimeoutError(
                f"result polling timeout {POLL_TIMEOUT} seconds exceeded"
            )

        time.sleep(1)
        elapsed_time += 1
        result = job.result()

    print(result.get_counts())
  • Execute the above script by running the command below.
python main.py
  • It should return something like:
Results OK
{'0': 1024}

Note: We get only 0’s because we are using the dummy cluster from quantify-scheduler.
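
If you would rather visualize the counts than print them, qiskit’s plot_histogram helper can render them. This short sketch assumes matplotlib is installed in the same environment (pip install matplotlib):

# optional: plot the measurement counts as a histogram
from qiskit.visualization import plot_histogram

counts = {"0": 1024}  # in main.py you could pass result.get_counts() directly
figure = plot_histogram(counts)
figure.savefig("counts.png")
print("histogram written to counts.png")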
