
Hopsworks Python SDK to interact with Hopsworks Platform, Feature Store, Model Registry and Model Serving

Project description

Hopsworks Client


hopsworks is the Python API for interacting with a Hopsworks cluster. Don't have a Hopsworks cluster just yet? Register an account on Hopsworks Serverless and get started for free. Once connected to your project, you can:

  • Insert DataFrames into the online or offline Feature Store, create training datasets, or serve real-time feature vectors via the Feature Store API. Already have data somewhere you want to import? Check out our Storage Connectors documentation.
  • Register ML models in the Model Registry and deploy them with Model Serving using the Machine Learning API.
  • Manage environments, executions, Kafka topics, and more once you deploy your own Hopsworks cluster, either on-premises or in the cloud. Hopsworks is open source and has its own Community Edition.

Our tutorials cover a wide range of use cases and examples of what you can build using Hopsworks.

Getting Started On Hopsworks

Once you have created a project on Hopsworks Serverless and generated a new API key, use your favourite virtual environment and package manager to install the library:

pip install "hopsworks[python]"

Fire up a notebook and connect to your project; you will be prompted to enter your newly created API key:

import hopsworks

project = hopsworks.login()

Feature Store API

Access the Feature Store of your project to use as a central repository for your feature data. Use your favourite data engineering library (pandas, polars, Spark, etc.) to insert data into the Feature Store, create training datasets, or serve real-time feature vectors. Want to predict the likelihood of e-scooter accidents in real time? Here's how you can do it:

fs = project.get_feature_store()

# Write to Feature Groups
bike_ride_fg = fs.get_or_create_feature_group(
  name="bike_rides",
  version=1,
  primary_key=["ride_id"],
  event_time="activation_time",
  online_enabled=True,
)

bike_ride_fg.insert(bike_rides_df)

# Create and read from Feature Views
profile_fg = fs.get_feature_group("user_profile", version=1)

bike_ride_fv = fs.get_or_create_feature_view(
  name="bike_rides_view",
  version=1,
  query=bike_ride_fg.select_except(["ride_id"]).join(profile_fg.select(["age", "has_license"]), on="user_id")
)

bike_rides_jan_2021_df = bike_ride_fv.get_batch_data(
  start_date="2021-01-01",
  end_date="2021-01-31"
)

# Create a training dataset
version, job = bike_ride_fv.create_train_test_split(
    test_size=0.2,
    description='Description of a dataset',
    # you can have different data formats such as csv, tsv, tfrecord, parquet and others
    data_format='csv'
)

# Predict the probability of accident in real-time using new data + context data
bike_ride_fv.init_serving()

while True:
    new_ride_vector = poll_ride_queue()
    feature_vector = bike_ride_fv.get_feature_vector(
      {"user_id": new_ride_vector["user_id"]},
      passed_features=new_ride_vector
    )
    accident_probability = model.predict(feature_vector)

The API enables interaction with the Hopsworks Feature Store. It makes creating new features, feature groups and training datasets easy.

The API is environment independent and can be used in two modes:

  • Spark mode: For data engineering jobs that create and write features into the feature store or generate training datasets. It requires a Spark environment such as the one provided in the Hopsworks platform or Databricks. In Spark mode, HSFS provides bindings both for Python and JVM languages.

  • Python mode: For data science jobs to explore the features available in the feature store, generate training datasets, and feed them into a training pipeline. Python mode requires just a Python interpreter and can be used in Hopsworks from Python jobs/Jupyter kernels, as well as from Amazon SageMaker or Kubeflow.

A Scala API is also available; here is a short sample of it:

import com.logicalclocks.hsfs._
val connection = HopsworksConnection.builder().build()
val fs = connection.getFeatureStore();
val attendances_features_fg = fs.getFeatureGroup("games_features", 1);
attendances_features_fg.show(1)

Machine Learning API

You can also use the Machine Learning API to interact with the Hopsworks Model Registry and Model Serving. The API makes it easy to export, manage, and deploy models. For example, to register models and deploy them for serving you can do:

mr = project.get_model_registry()
# or
ms = project.get_model_serving()

# Create a new model:
model = mr.tensorflow.create_model(name="mnist",
                                   version=1,
                                   metrics={"accuracy": 0.94},
                                   description="mnist model description")
model.save("/tmp/model_directory") # or /tmp/model_file

# Download a model:
model = mr.get_model("mnist", version=1)
model_path = model.download()

# Delete the model:
model.delete()

# Get the best-performing model
best_model = mr.get_best_model('mnist', 'accuracy', 'max')

# Deploy the model:
deployment = model.deploy()
deployment.start()

# Make predictions with a deployed model
data = { "instances": [ model.input_example ] }
predictions = deployment.predict(data)

Usage

Usage data is collected to improve the quality of the library. It is turned on by default when the backend is Hopsworks Serverless. To turn it off, use one of the following methods:

# use environment variable
import os
os.environ["ENABLE_HOPSWORKS_USAGE"] = "false"

# use `disable_usage_logging`
import hopsworks
hopsworks.disable_usage_logging()

The corresponding source code is in python/hopsworks_common/usage.py.

Tutorials

Need more inspiration or want to learn more about the Hopsworks platform? Check out our tutorials.

Documentation

Documentation is available at Hopsworks Documentation.

Issues

For general questions about the usage of Hopsworks and the Feature Store please open a topic on Hopsworks Community.

Please report any issue using the GitHub issue tracker and attach the client environment from the output below to your issue:

import hopsworks
hopsworks.login()
print(hopsworks.get_sdk_info())

Contributing

If you would like to contribute to this library, please see the Contribution Guidelines.

Project details


Release history

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hopsworks-4.5.0rc5.tar.gz (514.7 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

hopsworks-4.5.0rc5-py3-none-any.whl (692.7 kB)

Uploaded Python 3

File details

Details for the file hopsworks-4.5.0rc5.tar.gz.

File metadata

  • Download URL: hopsworks-4.5.0rc5.tar.gz
  • Upload date:
  • Size: 514.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.10.12

File hashes

Hashes for hopsworks-4.5.0rc5.tar.gz:

  • SHA256: eff2785cc5af4ee5207f248801f2577678a3acc95729c3ce0311902dd6f5737d
  • MD5: 44e76e7ed42e1d8e54b307318a9f50d6
  • BLAKE2b-256: 8661b4411075a8fc20d6df49781cff3aa93eb7ac725bcbd0d7351d235e7db00e

See more details on using hashes here.
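As a quick illustration, a published digest can be checked locally with Python's standard hashlib module before installing a downloaded archive. This is a minimal sketch; the local file path is an assumption and should point at wherever you saved the download:

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Digest published above for hopsworks-4.5.0rc5.tar.gz
expected = "eff2785cc5af4ee5207f248801f2577678a3acc95729c3ce0311902dd6f5737d"

# Compare against the local download before installing
# (uncomment once the archive is in the current directory):
# assert sha256_of("hopsworks-4.5.0rc5.tar.gz") == expected
```

Alternatively, pip's hash-checking mode (`pip install --require-hashes -r requirements.txt`) performs the same verification automatically.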

File details

Details for the file hopsworks-4.5.0rc5-py3-none-any.whl.

File metadata

  • Download URL: hopsworks-4.5.0rc5-py3-none-any.whl
  • Upload date:
  • Size: 692.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.10.12

File hashes

Hashes for hopsworks-4.5.0rc5-py3-none-any.whl:

  • SHA256: ae8f65c64d6e5a3c3778b6cb5a1078097695177205ba54c61a4e9455192c5218
  • MD5: 2574c8f167117225e3e1d5b6a1820a52
  • BLAKE2b-256: 8119a590142a69bdf374def873461c8ba2bf1cd9edb11a50d713bba362956c18

See more details on using hashes here.
