```python
import numpy as np

from zenml.steps import step


@step
def normalize(images: np.ndarray) -> np.ndarray:
    """Normalize images so the values are between 0 and 1."""
    return images / 255.0
```

Did you notice the `@step` decorator above the normalization function? That's all that was needed to transform this into a ZenML step that can be used in all your pipelines. The step that loads the data looks similar:

```python
@step
def load_data() -> np.ndarray:
    ...
```
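To see what the step body computes on its own, here is a minimal plain-NumPy sketch (outside ZenML, using a made-up three-pixel image) of the same division by 255:

```python
import numpy as np

# A tiny fake grayscale image covering the full uint8 range
images = np.array([[0, 128, 255]], dtype=np.uint8)

# The same scaling the normalize step performs
normalized = images / 255.0

print(normalized.min(), normalized.max())  # values now lie in [0.0, 1.0]
```

Dividing a uint8 array by a float promotes the result to float64, so the normalized images are ready for frameworks that expect floating-point inputs.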
```python
from zenml.pipelines import pipeline


@pipeline
def load_and_normalize_pipeline(
    load_data_step,
    normalize_step,
):
    # Connect the inputs and outputs of our pipeline steps
    images = load_data_step()
    normalize_step(images=images)


# Create and run our pipeline
load_and_normalize_pipeline(load_data(), normalize()).run()
```
*Figure 1: Example stacks for local development (left) and production using Apache Airflow and GCP (right)*
```shell
zenml stack set production_stack
```
```python
from zenml.repository import Repository

# Get a pipeline from our ZenML repository
pipeline = Repository().get_pipeline(pipeline_name="my_pipeline")

# Get the latest run of our pipeline
pipeline_run = pipeline.runs[-1]

# Get a specific step of the pipeline run
evaluation_step = pipeline_run.get_step(name="evaluation_step")

# Use the step parameters or outputs
class_weights = evaluation_step.parameters["class_weights"]
evaluation_accuracy = evaluation_step.output.read()
```
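Once outputs are readable like this, you can post-process runs however you like. As a hedged sketch in plain Python (the `(run_name, accuracy)` pairs are hypothetical stand-ins for values you would collect from `pipeline.runs` and `step.output.read()` as shown above), picking the most accurate run:

```python
def best_run(runs):
    """Return the (name, accuracy) pair with the highest accuracy."""
    return max(runs, key=lambda pair: pair[1])

# Hypothetical accuracies, e.g. collected by looping over pipeline.runs
history = [("run_1", 0.91), ("run_2", 0.95), ("run_3", 0.89)]
print(best_run(history))  # -> ('run_2', 0.95)
```

Because every run and artifact is tracked by ZenML, this kind of comparison works across all past runs, not just the latest one.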