Quick Start

Let’s set up the Tergite stack to run against a simulator on your local machine.

We will not need an actual quantum computer.

Prerequisites

You may need to install the following software if you don’t already have it: Docker, MongoDB Compass, Conda (or Python 3.9+), Redis, Git, and a text editor such as Visual Studio Code.

Set up the Frontend

  • Ensure Docker is installed and running.
docker --help
  • Clone the tergite-frontend repository.
git clone https://github.com/tergite/tergite-frontend.git
  • Enter the tergite-frontend folder
cd tergite-frontend
  • Create an mss-config.toml file with Visual Studio Code (or any other text editor).
code mss-config.toml
  • Update the mss-config.toml file with the following content.
# mss-config.toml

# general configurations
[general]
# the port on which MSS is running
mss_port = 8002
# the port on which the websocket is running
ws_port = 6532
# environment reflects which environment the app is to run in.
environment = "development"
# the host the uvicorn runs on.
# During testing auth on 127.0.0.1, set this to "127.0.0.1". default: "0.0.0.0"
mss_host = "127.0.0.1"

[database]
# configurations for the database
name = "testing"
# database URI
# host.docker.internal resolves to the host's 127.0.0.1
# see https://stackoverflow.com/questions/31324981/how-to-access-host-port-from-docker-container#answer-43541732
url = "mongodb://host.docker.internal:27017"

[[backends]]
name = "qiskit_pulse_1q"
# the URL where this backend is running
# host.docker.internal resolves to the host's 127.0.0.1
# see https://stackoverflow.com/questions/31324981/how-to-access-host-port-from-docker-container#answer-43541732
url = "http://host.docker.internal:8000"

[auth]
# turn auth OFF or ON, default=true
is_enabled = false
cookie_domain = "127.0.0.1"
cookie_name = "tergiteauth"

[[auth.clients]]
name = "github"
client_id = "some-github-obtained-client-id"
client_secret = "some-github-obtained-client-secret"
redirect_url = "http://127.0.0.1:8002/auth/app/github/callback"
client_type = "github"
email_regex = "^(john\\.doe|jane|aggrey)@example\\.com$"
email_domain = "example.com"
roles = ["admin", "user"]

[[auth.clients]]
name = "puhuri"
client_id = "some-puhuri-obtained-client-id"
client_secret = "some-puhuri-obtained-client-secret"
redirect_url = "http://127.0.0.1:8002/auth/app/puhuri/callback"
client_type = "openid"
email_regex = "^(john\\.doe|jane)@example\\.com$"
email_domain = "example.com"
roles = ["user"]
openid_configuration_endpoint = "https://proxy.acc.puhuri.eduteams.org/.well-known/openid-configuration"

# Puhuri synchronization
# Puhuri is a resource management platform for HPC systems that is also to be used for quantum computers.
[puhuri]
# turn puhuri synchronization OFF or ON, default=true
is_enabled = false
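
Optionally, you can sanity-check that the file parses as valid TOML before starting the services. Below is a minimal sketch using Python's tomllib (standard library in Python 3.11+; on older versions, pip install tomli provides the same load function). It is only a convenience check and not part of the Tergite stack.

# check_mss_config.py
# optional sanity check, not part of the Tergite stack
import tomllib  # Python 3.11+; on older versions: pip install tomli, then "import tomli as tomllib"

with open("mss-config.toml", "rb") as f:
    cfg = tomllib.load(f)

# print a few values to confirm the file parsed as intended
print("MSS port:", cfg["general"]["mss_port"])
print("Auth enabled:", cfg["auth"]["is_enabled"])
print("Backends:", [b["name"] for b in cfg["backends"]])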
  • Create a .env file with Visual Studio Code (or any other text editor).
code .env
  • Update the .env file with the following content.
# .env

# required
ENVIRONMENT="development"
MSS_V2_API_URL="http://127.0.0.1:8002/v2"
GRAFANA_LOKI_URL=http://127.0.0.1:3100/loki/api/v1/push
LOKI_LOGGER_ID=some-generic-id

# docker LOGGING_DRIVER can be journald, json-file, local etc. 
LOGGING_DRIVER=json-file
# image versions:
# Note: If you ever want the images to be rebuilt, 
# you have to change the app version numbers here 
# before running "docker compose up"
MSS_VERSION=v0.0.1
DASHBOARD_VERSION=v0.0.1
PROMTAIL_VERSION=2.8.3
  • Open the MongoDB Compass application and connect to the default local MongoDB database

  • Create a new MongoDB database called “testing” that contains a “backends” collection.
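
If you prefer doing this from code instead of MongoDB Compass, a minimal sketch using the pymongo package (an assumption; install it with pip install pymongo) could create the same database and collection:

# create_testing_db.py
# optional alternative to the MongoDB Compass steps above
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["testing"]

# MongoDB creates databases lazily, so explicitly create the "backends"
# collection to make the "testing" database visible immediately
if "backends" not in db.list_collection_names():
    db.create_collection("backends")

print(client.list_database_names())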

  • Delete any old docker images of “tergite/tergite-mss” and “tergite/tergite-dashboard” if they exist.
docker rmi tergite/tergite-mss:v0.0.1
docker rmi tergite/tergite-dashboard:v0.0.1
  • To run the services, use the fresh-docker-compose.yml file.
docker compose -f fresh-docker-compose.yml up -d
  • Remove any stale artefacts created during the docker build
docker system prune
  • Check the status of the services
docker compose -f fresh-docker-compose.yml ps
  • To stop the services, run:
docker compose -f fresh-docker-compose.yml stop
  • To stop the services and also remove their containers, run:
docker compose -f fresh-docker-compose.yml down
  • To view the logs of the docker containers (e.g. to catch errors), use:
docker compose -f fresh-docker-compose.yml logs -f

See more at https://docs.docker.com/reference/cli/docker/compose/logs/

  • Ensure that the services are running. If they are not, restart them.
docker compose -f fresh-docker-compose.yml up -d

Set up the Backend

  • Ensure you have conda installed. (You could simply have Python 3.9+ installed instead.)
  • Ensure you have the Redis server running.
redis-server
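
If you want to confirm that Redis is actually answering on the default port (6379), a small sketch using the redis Python package (an assumption; install it with pip install redis) is shown below:

# check_redis.py
# optional check that Redis answers on the default port
import redis

try:
    redis.Redis(host="localhost", port=6379).ping()
    print("Redis is up")
except redis.exceptions.ConnectionError as err:
    print("Redis is not reachable:", err)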
  • Clone the tergite-backend repository.
git clone https://github.com/tergite/tergite-backend.git
  • Create and activate a conda environment
conda create -n bcc -y python=3.9
conda activate bcc
  • Enter the tergite-backend folder and install the dependencies
cd tergite-backend
pip install -r requirements.txt
  • Create a .env file with Visual Studio Code (or any other text editor).
code .env
  • Update the .env file with the following content.
APP_SETTINGS=development
IS_AUTH_ENABLED=False

DEFAULT_PREFIX=qiskit_pulse_1q
STORAGE_ROOT=/tmp
LOGFILE_DOWNLOAD_POOL_DIRNAME=logfile_download_pool
LOGFILE_UPLOAD_POOL_DIRNAME=logfile_upload_pool
JOB_UPLOAD_POOL_DIRNAME=job_upload_pool
JOB_PRE_PROC_POOL_DIRNAME=job_preproc_pool
JOB_EXECUTION_POOL_DIRNAME=job_execution_pool

# Main Service Server
MSS_MACHINE_ROOT_URL=http://localhost:8002
MSS_PORT=8002

# Backend Control computer
BCC_MACHINE_ROOT_URL=http://localhost:8000
BCC_PORT=8000

EXECUTOR_TYPE=qiskit_pulse_1q
  • Create a backend_config.toml file with Visual Studio Code (or any other text editor).
code backend_config.toml
  • Update the backend_config.toml file with the following content.
# backend_config.toml
[general_config]
name = "qiskit_pulse_1q"
version = "1.0.0"
is_active = true
characterized = true
open_pulse = true
simulator = true
online_date = "2024-09-11T18:07:31"
num_qubits = 1
num_couplers = 0
num_resonators = 1
description = "A single transmon Hamiltonian with 4 levels"
dt = 1e-9
dtm = 1e-9

[device_config]
discriminators = [ "lda" ]

qubit_ids = [ "q0" ]

coordinates = [ [ 0, 0 ] ]

coupling_map = [[ 0, 0 ]]

meas_map = [ [ 0 ] ]

qubit_parameters = [
  "id",
  "x_position",
  "y_position",
  "xy_drive_line",
  "z_drive_line",
  "frequency",
  "pi_pulse_amplitude",
  "pi_pulse_duration",
  "pulse_type",
  "pulse_sigma",
  "t1_decoherence",
  "t2_decoherence"
]
resonator_parameters = [
  "id",
  "x_position",
  "y_position",
  "readout_line",
  "acq_delay",
  "acq_integration_time",
  "frequency",
  "pulse_delay",
  "pulse_duration",
  "pulse_type",
  "pulse_amplitude"
]

coupler_parameters = [
  "id",
  "x_position",
  "y_position",
  "xy_drive_line",
  "z_drive_line",
  "frequency",
  "pi_pulse_amplitude",
  "pi_pulse_duration",
  "pulse_type",
  "pulse_sigma",
  "t1_decoherence",
  "t2_decoherence"
]

[device_config.discriminator_parameters]
lda = [
  "coef_0",
  "coef_1",
  "intercept"
]

[gates.x]
qasm_def = "gate x q { U(pi, 0, pi) q; }"
parameters = [ ]
coupling_map = [[0]]

[simulator_config.units.qubit]
frequency = "Hz"
t1_decoherence = "s"
t2_decoherence = "s"

[simulator_config.units.readout_resonator]
acq_delay = "s"
acq_integration_time = "s"
frequency = "Hz"
pulse_delay = "s"
pulse_duration = "s"

[simulator_config.units.discriminators]
coef_0 = ""  
coef_1 = ""
intercept = ""

[[simulator_config.qubit]]
frequency = 4_700_000_000
pi_pulse_amplitude = 0.05
pi_pulse_duration = 5.6e-9
pulse_sigma = 7e-9
pulse_type = "Gaussian"
t1_decoherence = 0.000071
t2_decoherence = 0.000069
id = "q0"

[[simulator_config.readout_resonator]]
acq_delay = 5e-8
acq_integration_time = 0.000001
frequency = 7_260_080_000
pulse_delay = 0
pulse_duration = 9e-7
pulse_type = "Square"
pulse_amplitude = 0.1266499392606423
id = "q0"


[simulator_config.discriminators.lda.q0]
intercept = -38.4344477840827
coef_0 = -98_953.87504155144
coef_1 = -114_154.48696231026
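
For orientation, the lda discriminator values above are the coefficients of a linear classifier over a measured (I, Q) readout point: conceptually, the qubit state is decided by which side of the line coef_0 * I + coef_1 * Q + intercept = 0 the point falls on. The sketch below is purely illustrative; the exact sign and label convention used by the simulator is an assumption.

# discriminate.py
# illustrative only: how an LDA-style discriminator labels an (I, Q) point;
# the sign/label convention used by the Tergite simulator is an assumption
COEF_0 = -98_953.87504155144
COEF_1 = -114_154.48696231026
INTERCEPT = -38.4344477840827

def discriminate(i: float, q: float) -> int:
    """Return 0 or 1 depending on which side of the decision boundary (i, q) falls on."""
    score = COEF_0 * i + COEF_1 * q + INTERCEPT
    return int(score > 0)

print(discriminate(0.0, 0.0))      # the origin: score is negative, so label 0
print(discriminate(-1e-3, -1e-3))  # a point on the other side: score is positive, so label 1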
  • Run the start script
./start_bcc.sh
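
At this point both the frontend (MSS on port 8002) and the backend (BCC on port 8000) should be reachable. A quick way to confirm this from Python using only the standard library is sketched below; the root paths are an assumption, and any HTTP response, even an error status, still shows that the service is up.

# check_services.py
# optional reachability check for MSS (port 8002) and BCC (port 8000)
import urllib.error
import urllib.request

SERVICES = {
    "MSS": "http://localhost:8002/",
    "BCC": "http://localhost:8000/",
}

for name, url in SERVICES.items():
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            print(f"{name} responded with HTTP {resp.status}")
    except urllib.error.HTTPError as err:
        # an HTTP error status still means the service is up and answering
        print(f"{name} is up but returned HTTP {err.code}")
    except OSError as err:
        print(f"{name} is not reachable: {err}")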

Run an Experiment

  • Open another terminal

  • Create a new folder “tergite-test” and enter it

mkdir tergite-test
cd tergite-test
  • Create a conda environment and activate it
conda create -n tergite -y python=3.9
conda activate tergite
  • Install qiskit and tergite
pip install qiskit
pip install tergite
  • Create a file main.py with Visual Studio Code (or any other text editor).
code main.py
  • Update the main.py file with the following content:
# main.py
"""A sample script doing a very simple quantum operation"""
import time

import qiskit.circuit as circuit
import qiskit.compiler as compiler

from tergite.qiskit.providers import Job, Tergite
from tergite.qiskit.providers.provider_account import ProviderAccount

if __name__ == "__main__":
    # the Tergite API URL
    API_URL = "http://localhost:8002"
    # The name of the Quantum Computer to use from the available quantum computers
    BACKEND_NAME = "qiskit_pulse_1q"
    # the name of this service. For your own bookkeeping.
    SERVICE_NAME = "local"
    # the timeout in seconds for how long to keep checking for results
    POLL_TIMEOUT = 100

    # create the Qiskit circuit
    qc = circuit.QuantumCircuit(1)
    qc.x(0)
    qc.h(0)
    qc.measure_all()

    # create a provider
    # provider account creation can be skipped in case you already saved
    # your provider account to the `~/.qiskit/tergiterc` file.
    # See below how that is done.
    account = ProviderAccount(service_name=SERVICE_NAME, url=API_URL)
    provider = Tergite.use_provider_account(account)
    # to save this account to the `~/.qiskit/tergiterc` file, add the `save=True`
    # provider = Tergite.use_provider_account(account, save=True)

    # Get the Tergite backend in case you skipped provider account creation
    # provider = Tergite.get_provider(service_name=SERVICE_NAME)
    backend = provider.get_backend(BACKEND_NAME)
    backend.set_options(shots=1024)

    # compile the circuit
    tc = compiler.transpile(qc, backend=backend)

    # run the circuit
    job: Job = backend.run(tc, meas_level=2, meas_return="single")

    # view the results
    elapsed_time = 0
    result = None
    while result is None:
        if elapsed_time > POLL_TIMEOUT:
            raise TimeoutError(
                f"result polling timeout {POLL_TIMEOUT} seconds exceeded"
            )

        time.sleep(1)
        elapsed_time += 1
        result = job.result()

    print(result.get_counts())
  • Execute the above script by running the command below.
python main.py
  • It should return something like:
Results OK
{'0': 776, '1': 248}
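
If you would like to visualize the counts, Qiskit's plot_histogram helper can render them (this is optional and assumes matplotlib is installed, e.g. via pip install matplotlib):

# plot_counts.py
# optional: visualize the measurement counts from main.py
from qiskit.visualization import plot_histogram

counts = {"0": 776, "1": 248}  # replace with the result.get_counts() output from main.py
fig = plot_histogram(counts)
fig.savefig("counts.png")
print("saved histogram to counts.png")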