Resonator Spectroscopy with DC Bias¶
Prerequisites¶
This guide assumes you have a configured DeviceSetup as well as Qubit objects with assigned parameters. Please see our tutorials if you need to create your setup and qubits for the first time. However, you can also run this notebook "as is" using an emulated session. If you are just getting started with the LabOne Q Applications Library, please don't hesitate to reach out to us at info@zhinst.com.
Background¶
In this notebook, we will learn how to extend the resonator-spectroscopy workflow to include a sweep of the DC bias provided by a third-party instrument. In this tutorial, we will do the following, in order:
- Extend the TunableTransmonQubit functionality to include a DC-bias property
- Learn how to create a new Quantum Operation that sweeps the bias of a tunable transmon, and include it in our standard set of operations
- Learn how to control third-party devices using near-time callback functions inside Quantum Operations
- Build a new Workflow that exploits these new functionalities
Imports¶
You'll start by importing numpy and laboneq.simple.
from __future__ import annotations
import numpy as np
from laboneq.simple import *
Define your experimental setup¶
Let's define our experimental setup. We will need:
a set of TunableTransmonOperations
a QPU
Here, we will be brief. We will mainly provide the code to obtain these objects. To learn more, check out these other tutorials:
We will use 3 TunableTransmonQubits in this guide. Change this number to the one describing your setup.
number_of_qubits = 3
DeviceSetup¶
This guide requires a setup that can drive and read out tunable transmon qubits. Your setup could contain an SHFQC+ instrument, or an SHFSG and an SHFQA instrument. Here, we will use an SHFQC+ with 6 signal generation channels and a PQSC.
If you have used LabOne Q before and already have a DeviceSetup for your setup, you can reuse that.
If you do not have a DeviceSetup, you can create one using the code below. Just change the device numbers to the ones in your rack and adjust any other input parameters as needed.
# Setting get_zsync=True below automatically detects the zsync ports of the PQSC that
# are used by the other instruments in this descriptor.
# Here, we are not connected to instruments, so we set this flag to False.
from laboneq.contrib.example_helpers.generate_descriptor import generate_descriptor
descriptor = generate_descriptor(
    pqsc=["DEV10001"],
    shfqc_6=["DEV12001"],
    number_data_qubits=number_of_qubits,
    multiplex=True,
    number_multiplex=number_of_qubits,
    include_cr_lines=False,
    get_zsync=False,  # set to True when at a real setup
    ip_address="localhost",
)
setup = DeviceSetup.from_descriptor(descriptor, "localhost")
Qubits¶
We will generate 3 TunableTransmonQubits from the logical signal groups in our DeviceSetup. The names of the logical signal groups, q0, q1, q2, will be the UIDs of the qubits. Moreover, the qubits will have the same logical signal lines as the ones of the logical signal groups in the DeviceSetup.
from laboneq_applications.qpu_types.tunable_transmon import TunableTransmonQubit
qubits = TunableTransmonQubit.from_device_setup(setup)
for q in qubits:
    print("-------------")
    print("Qubit UID:", q.uid)
    print("Qubit logical signals:")
    for sig, lsg in q.signals.items():
        print(f"  {sig:<10} ('{lsg:>10}')")
Configure the qubit parameters to reflect the properties of the qubits on your QPU using the following code:
for q in qubits:
    q.parameters.ge_drive_pulse["sigma"] = 0.25
    q.parameters.readout_amplitude = 0.5
    q.parameters.reset_delay_length = 1e-6
    q.parameters.readout_range_out = -25
    q.parameters.readout_lo_frequency = 7.4e9
qubits[0].parameters.drive_lo_frequency = 6.4e9
qubits[0].parameters.resonance_frequency_ge = 6.3e9
qubits[0].parameters.resonance_frequency_ef = 6.0e9
qubits[0].parameters.readout_resonator_frequency = 7.0e9
qubits[1].parameters.drive_lo_frequency = 6.4e9
qubits[1].parameters.resonance_frequency_ge = 6.5e9
qubits[1].parameters.resonance_frequency_ef = 6.3e9
qubits[1].parameters.readout_resonator_frequency = 7.3e9
qubits[2].parameters.drive_lo_frequency = 6.0e9
qubits[2].parameters.resonance_frequency_ge = 5.8e9
qubits[2].parameters.resonance_frequency_ef = 5.6e9
qubits[2].parameters.readout_resonator_frequency = 7.2e9
Quantum Operations¶
Create the set of TunableTransmonOperations:
from laboneq_applications.qpu_types.tunable_transmon import TunableTransmonOperations
qops = TunableTransmonOperations()
QPU¶
Create the QPU object from the qubits and the quantum operations
from laboneq.dsl.quantum import QPU
qpu = QPU(qubits, quantum_operations=qops)
Alternatively, load from a file¶
If you already have a DeviceSetup and a QPU stored in .json files, you can simply load them back using the code below:
from laboneq import serializers
setup = serializers.load(full_path_to_device_setup_file)
qpu = serializers.load(full_path_to_qpu_file)
qubits = qpu.quantum_elements
qops = qpu.quantum_operations
Connect to Session¶
session = Session(setup)
session.connect(do_emulation=True) # do_emulation=False when at a real setup
Create a FolderStore for Saving Data¶
The experiment Workflows can automatically save the inputs and outputs of all their tasks to the folder path we specify when instantiating the FolderStore. Here, we choose the current working directory.
# import FolderStore from the `workflow` namespace of LabOne Q, which was imported
# from `laboneq.simple`
from pathlib import Path
folder_store = workflow.logbook.FolderStore(Path.cwd())
We disable saving in this guide. To enable it, simply run folder_store.activate().
folder_store.deactivate()
Optional: Configure the LoggingStore¶
You can also activate/deactivate the LoggingStore, which is used for displaying the Workflow logging information in the notebook; see again the tutorial on Recording Experiment Workflow Results for details.
Displaying the Workflow logging information is activated by default, but here we deactivate it to shorten the outputs, which are not very meaningful in emulation mode.
We recommend that you do not deactivate the Workflow logging in practice.
from laboneq.workflow.logbook import LoggingStore
logging_store = LoggingStore()
logging_store.deactivate()
Configure the setup for using an external DC Bias¶
Set the relevant qubit parameters¶
First, let's set the relevant qubit parameters for controlling an external DC voltage source. The TunableTransmonQubit object already possesses two parameters that describe the DC bias of the qubit:
- dc_slot, which describes which channel the qubit is connected to
- dc_voltage_parking, which describes how much voltage the channel should provide.
for n, qubit in enumerate(qubits):
    qubit.parameters.dc_slot = n
    qubit.parameters.dc_voltage_parking = 0
Above, we used the convention that the qubits are connected to the channels in order, and we set the initial value of the DC bias to 0 Volt. Let's look at a qubit to see how this object changed:
print(qubits[2].parameters)
Create a Quantum Operation to set the DC bias for a qubit¶
Next, we want to create a Quantum Operation to set the DC bias of a qubit. To do this, we will follow these steps:
- Create a function to set the DC in a third-party instrument
- Register the function to our session with a standard name
- Create a Quantum Operation that uses the function from the session
- Register the new quantum operation in our platform
Create a function to set the DC bias of a qubit¶
Next, we need to provide a function that sets the DC bias of a particular qubit. The exact form of this function will depend on the driver of the device to be used; for the purpose of this tutorial, we will just use a mock to show how this is done. Following the prescription provided in the Neartime-Callback Functions and 3rd-Party Devices chapter of the documentation, the first argument of such a function should always be the session.
def my_setter(
    session: Session,
    voltage,
    channel,
):
    return f"channel={channel}, voltage={voltage:.1f}"
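On a real setup, the setter would forward the values to the instrument driver instead of only returning a string. Here is a minimal sketch of what that might look like; the `DcSourceDriver` class and its `set_voltage` method are hypothetical stand-ins for your instrument's actual driver API:

```python
class DcSourceDriver:
    """Mock driver standing in for a real DC-source driver."""

    def __init__(self):
        self.channels = {}

    def set_voltage(self, channel, voltage):
        # a real driver would talk to the hardware here
        self.channels[channel] = voltage


dc_source = DcSourceDriver()


def hardware_setter(session, voltage, channel):
    # the first argument must be the session, even if it is unused
    dc_source.set_voltage(channel, voltage)
    return f"channel={channel}, voltage={voltage:.1f}"
```

The returned string is recorded in the results object under the near-time callbacks, which is convenient for verifying later which values were actually set.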
Register function in the session and create a matching Quantum Operation¶
The next two steps are performed together to ensure that there is a proper match between the registered function and the Quantum Operation that uses it. Thanks to the quantum operation, we can be sure that this function will always be called with the correct parameters for a particular qubit. Notice how we allow the voltage to be selected optionally by the user. This allows us to override the voltage stored in the qubit parameters when needed, for example to sweep it in the context of an experiment.
func_id = "set_dc_bias"
session.register_neartime_callback(my_setter, func_id)
@qpu.quantum_operations.register
@dsl.quantum_operation(neartime=True)
def set_dc_bias(
    self,
    qubit,
    voltage=None,
):
    # fetch parameters
    ## voltage if not provided
    if voltage is None:
        voltage = qubit.parameters.dc_voltage_parking
    ## channel always provided by the qubit
    channel = qubit.parameters.dc_slot
    # call the function
    dsl.call(func_id, voltage=voltage, channel=channel)
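The parameter-fallback rule used by set_dc_bias can be exercised on its own. A minimal pure-Python sketch, using a hypothetical mock qubit built with SimpleNamespace purely for illustration:

```python
from types import SimpleNamespace

# hypothetical stand-in for a TunableTransmonQubit with DC-bias parameters
mock_qubit = SimpleNamespace(
    parameters=SimpleNamespace(dc_slot=1, dc_voltage_parking=0.7),
)


def resolve_bias(qubit, voltage=None):
    # same fallback rule as in set_dc_bias: use the parking voltage unless overridden
    if voltage is None:
        voltage = qubit.parameters.dc_voltage_parking
    return qubit.parameters.dc_slot, voltage


print(resolve_bias(mock_qubit))       # parking voltage is used
print(resolve_bias(mock_qubit, 1.2))  # explicit override wins
```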
Let's convince ourselves that the operation is there by inspecting its source code:
qpu.quantum_operations.set_dc_bias.src
Create a new Workflow using the new Quantum Operation¶
We are now set to include the new Quantum Operation in a workflow of our choice. For the purpose of this exercise, let's create a simplified version of a pulsed resonator spectroscopy where the DC bias is swept together with the frequency sent to the measure line.
Creating a new Workflow requires the tasks compile_experiment and run_experiment. Let's import them:
from laboneq.workflow.tasks import (
    compile_experiment,
    run_experiment,
)
Now let's write the experiment using the DSL. We use code similar to resonator_spectroscopy_amplitude, but simpler. The main change is that, instead of sweeping the amplitude, we now use our new Quantum Operation to change the DC bias, and we pass a sweep parameter to it.
@workflow.task
@dsl.qubit_experiment
def create_experiment(
    qpu,
    qubit,
    frequencies,
    voltages,
):
    with dsl.sweep(
        parameter=SweepParameter(f"voltages_{qubit.uid}", voltages),
    ) as voltage:
        qpu.quantum_operations.set_dc_bias(qubit, voltage=voltage)  # set dc bias here
        with dsl.acquire_loop_rt(
            count=100,
            averaging_mode=AveragingMode.SEQUENTIAL,
            acquisition_type=AcquisitionType.SPECTROSCOPY,
        ):
            with dsl.sweep(
                name=f"freq_{qubit.uid}",
                parameter=SweepParameter(f"frequencies_{qubit.uid}", frequencies),
            ) as frequency:
                qpu.quantum_operations.set_frequency(
                    qubit,
                    frequency=frequency,
                    readout=True,
                )
                qpu.quantum_operations.measure(
                    qubit,
                    dsl.handles.result_handle(qubit.uid),
                )
                qpu.quantum_operations.delay(qubit, 1e-6)
Now we are ready to define a new Workflow with the help of the standard imported tasks compile_experiment and run_experiment, together with create_experiment.
We also add the analysis workflow for extracting the parking readout-resonator frequency and voltage provided by the laboneq-applications package.
Finally, we use the update_qpu task also provided by the laboneq-applications package to update the qubit parameters readout_resonator_frequency and dc_parking_voltage with the values extracted from the analysis.
from laboneq_applications.analysis.resonator_spectroscopy_dc_bias import (
    analysis_workflow,
)
from laboneq_applications.tasks import update_qpu
@workflow.workflow_options
class ResonatorSpectroscopyDcBiasWorkflowOptions:
    do_analysis: bool = workflow.option_field(
        default=True,
        description="Whether to run the analysis workflow.",
    )
    update: bool = workflow.option_field(
        default=False,
        description="Whether to update the qubit parameters.",
    )


@workflow.workflow
def res_spec_with_dc_workflow(
    session,
    qpu,
    qubit,
    frequencies,
    voltages,
    options: ResonatorSpectroscopyDcBiasWorkflowOptions | None = None,
):
    # create experiment
    exp = create_experiment(
        qpu,
        qubit,
        frequencies=frequencies,
        voltages=voltages,
    )
    # compile it
    compiled_exp = compile_experiment(session, exp)
    # run it
    result = run_experiment(session, compiled_exp)
    with workflow.if_(options.do_analysis):
        # run the analysis
        analysis_results = analysis_workflow(
            result=result,
            qubit=qubit,
            frequencies=frequencies,
            voltages=voltages,
        )
        with workflow.if_(options.update):
            # update qubit parameters
            qubit_parameters = analysis_results.output
            update_qpu(qpu, qubit_parameters["new_parameter_values"])
    workflow.return_(
        experiment=exp,
        compiled_experiment=compiled_exp,
        result=result,
    )
Run the experiment¶
First, let's set the option field update to True, and the field parking_sweet_spot to "lss" (for "lower sweet spot"), meaning that we want to choose the lower sweet spot as the parking position for the qubit. If the lower-sweet-spot parameters are extracted by the analysis routine, they will be used to update the qubit parameters.
There are many additional option fields that come from the options defined in the analysis workflow; these are automatically extracted by res_spec_with_dc_workflow when it parses the logic inside it. For example, you could set options.close_figures(False) to show the figures in the kernel.
To learn more about the options-feature of workflow, check out this tutorial. To learn more about how to create options classes for your experiment tasks and workflows, check out this tutorial.
options = res_spec_with_dc_workflow.options()
options.update(True)
options.parking_sweet_spot("lss")
options
We are now good to go! Let's run our workflow with some test voltages. We should expect the printout of our mock function to appear with the voltages we passed.
qubit = qubits[0]  # run the experiment on the first qubit
my_workflow = res_spec_with_dc_workflow(
    session=session,
    qpu=qpu,
    qubit=qubit,
    frequencies=np.linspace(6.5e9, 7.5e9, 101),
    voltages=np.linspace(0, 3, 7),
    options=options,
)
my_results = my_workflow.run()
To verify that the function was run, let's inspect the result object and check the near-time callbacks:
my_results.tasks["run_experiment"].output.neartime_callbacks["set_dc_bias"]
Let's check that the qubit parameters have been updated:
qubit.parameters.readout_resonator_frequency, qubit.parameters.dc_voltage_parking
These values come from the qubit_parameters dictionary returned by the analysis workflow:
my_results.tasks["analysis_workflow"].output
Add task to a workflow¶
Now that we have convinced ourselves that this works, and have used the previous experiment to find the optimal parking voltage for each qubit, we can expand this concept and use a Task to define a calibration routine for the DC sources. First, let's set some mock voltage values for the qubits on our platform.
voltages = [1, 1.3, 1.5, 1.2, 1.7, 0.9]
for voltage, qubit in zip(voltages, qpu.quantum_elements):
    qubit.parameters.dc_voltage_parking = voltage
Next, let's define a Task to automatically set all the correct DC biases. Some DC sources allow these values to be set in parallel, so let's explore this more general case to exploit that feature. We will prepare a dictionary of parameters starting from the QPU:
@workflow.task
def set_all_dc(
    qpu,
):
    dc_dict = {}
    for qubit in qpu.quantum_elements:
        dc_dict[qubit.uid] = {
            "channel": qubit.parameters.dc_slot,
            "voltage": qubit.parameters.dc_voltage_parking,
        }
    # mocking the voltage settings
    for key, value in dc_dict.items():
        voltage = value["voltage"]
        channel = value["channel"]
        print(f"voltage {voltage} Volt was set in channel {channel} for qubit {key}")
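For sources that truly support parallel setting, the per-qubit dictionary built above could feed a single batched call instead of a loop of prints. A minimal sketch, assuming a hypothetical driver with a `set_voltages` method that accepts parallel channel and voltage lists:

```python
class ParallelDcSource:
    """Mock driver for a DC source that can set several channels at once."""

    def __init__(self):
        self.channels = {}

    def set_voltages(self, channels, voltages):
        # a real driver would issue one batched hardware command here
        for channel, voltage in zip(channels, voltages):
            self.channels[channel] = voltage


def set_all_dc_parallel(session, dc_dict, source):
    # flatten the per-qubit dictionary into parallel channel/voltage lists
    channels = [entry["channel"] for entry in dc_dict.values()]
    voltages = [entry["voltage"] for entry in dc_dict.values()]
    source.set_voltages(channels, voltages)
    return dict(zip(channels, voltages))


source = ParallelDcSource()
dc_dict = {
    "q0": {"channel": 0, "voltage": 1.0},
    "q1": {"channel": 1, "voltage": 1.3},
}
print(set_all_dc_parallel(None, dc_dict, source))
```

Such a function could be registered as a near-time callback in the same way as my_setter above, with the session as its first argument.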
Now let's assume we want to make sure all values are correctly set before we run another workflow. To do this, we take an existing workflow and add the above task to it to perform this calibration:
from laboneq_applications.experiments import amplitude_rabi, options
@workflow.workflow
def new_rabi_workflow(
    session,
    qpu,
    qubits,
    amplitudes,
    options: options.TuneUpWorkflowOptions | None = None,
):
    # calibrate dc sources
    set_all_dc(qpu)
    exp = amplitude_rabi.create_experiment(
        qpu,
        qubits,
        amplitudes=amplitudes,
    )
    compiled_exp = compile_experiment(session, exp)
    _result = run_experiment(session, compiled_exp)
Let's check that the printout is there by running the workflow!
new_workflow = new_rabi_workflow(
    session,
    qpu,
    qpu.quantum_elements,
    amplitudes=[np.linspace(0.1, 1.0, 11) for n in range(number_of_qubits)],
)
workflow_result = new_workflow.run()