flink/flink-python
bin [FLINK-15616][python] Move boot error messages from python-udf-boot.log to taskmanager's log file 5 years ago
dev [FLINK-14944][python] Fix pip install ConnectionResetError 5 years ago
docs [hotfix][python][docs] Fix python doc nav bar not showing and layout issue. 6 years ago
lib [FLINK-14557][python] Clean up the py4j package by removing the unused directory __MACOSX. 5 years ago
pyflink [FLINK-15487][table] Allow registering FLIP-65 functions in TableEnvironment 5 years ago
src [FLINK-15897][python] Defer the deserialization of the Python UDF execution results 5 years ago
MANIFEST.in [FLINK-12962][python] Allows pyflink to be pip installed. 6 years ago
README.md [FLINK-14509][python] Improve the README.md in flink-python to prepare for PyPI release. 5 years ago
pom.xml [FLINK-15338][python] Cherry-pick BEAM-9006#10462 to fix the TM Metaspace memory leak problem when submitting PyFlink UDF jobs multiple times. 5 years ago
setup.cfg [FLINK-12962][python] Allows pyflink to be pip installed. 6 years ago
setup.py [FLINK-15937][python] Update the Development Status to 5 - Production/Stable (#11028) 5 years ago
tox.ini [FLINK-15929][python] Update the version limit of grpcio to 1.26.0 5 years ago

README.md

Apache Flink

Apache Flink is a framework and distributed processing engine for stateful computations over unbounded and bounded data streams. Flink has been designed to run in all common cluster environments, perform computations at in-memory speed and at any scale.

Learn more about Flink at https://flink.apache.org/

Python Packaging

This packaging allows you to write Flink programs in Python. It is currently an early version, and the APIs are subject to change in future releases.

In this initial version, only the Table API is supported. You can find the documentation at https://ci.apache.org/projects/flink/flink-docs-stable/dev/table/tableApi.html

The tutorial can be found at https://ci.apache.org/projects/flink/flink-docs-stable/tutorials/python_table_api.html

The auto-generated Python docs can be found at https://ci.apache.org/projects/flink/flink-docs-stable/api/python/
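
For a first impression of the Table API, the following minimal word-count sketch builds a small in-memory table, aggregates it and writes the result to a CSV sink. It is only an illustration: it assumes a local installation of the apache-flink package, the output path and job name are placeholders, and the exact environment-setup calls can differ between Flink versions, so please refer to the documentation linked above.

    from pyflink.table import BatchTableEnvironment, DataTypes, EnvironmentSettings
    from pyflink.table.sinks import CsvTableSink

    # Create a batch TableEnvironment backed by the Blink planner.
    settings = EnvironmentSettings.new_instance().in_batch_mode().use_blink_planner().build()
    t_env = BatchTableEnvironment.create(environment_settings=settings)

    # Register a CSV sink; the output path is only a placeholder for this sketch.
    t_env.register_table_sink(
        "results",
        CsvTableSink(["word", "cnt"],
                     [DataTypes.STRING(), DataTypes.BIGINT()],
                     "/tmp/word_count_output.csv"))

    # Build a small table from Python objects, aggregate it and emit it to the sink.
    t_env.from_elements([("hello", 1), ("world", 1), ("hello", 1)], ["word", "cnt"]) \
        .group_by("word") \
        .select("word, cnt.sum as cnt") \
        .insert_into("results")

    t_env.execute("word_count_sketch")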

Python Requirements

The Apache Flink Python API depends on Py4J (currently version 0.10.8.1), CloudPickle (currently version 1.2.2), python-dateutil (currently version 2.8.0) and Apache Beam (currently version 2.15.0).
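
To check which versions of these dependencies are installed in your environment, you can run a small snippet like the one below (the distribution names passed to pkg_resources are assumed to be the PyPI package names):

    import pkg_resources

    # PyPI distribution names of the dependencies listed above.
    for dist in ["py4j", "cloudpickle", "python-dateutil", "apache-beam"]:
        try:
            print(dist, pkg_resources.get_distribution(dist).version)
        except pkg_resources.DistributionNotFound:
            print(dist, "not installed")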

Development Notices

Protobuf Code Generation

Protocol Buffers are used in flink_fn_execution_pb2.py, which is generated from flink-fn-execution.proto. Whenever flink-fn-execution.proto is updated, please regenerate flink_fn_execution_pb2.py by executing:

python pyflink/gen_protos.py

PyFlink depends on the following libraries to execute the above script:

  1. grpcio-tools (>=1.3.5,<=1.14.2)
  2. setuptools (>=37.0.0)
  3. pip (>=7.1.0)
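
After regeneration, one quick sanity check is to import the generated module and list the message types it defines. The import path below assumes the generated file is placed under pyflink/fn_execution, which may differ between versions:

    # Illustrative sanity check; the module path is an assumption and may differ
    # depending on where gen_protos.py writes the generated file.
    from pyflink.fn_execution import flink_fn_execution_pb2

    # Print the names of the protobuf message types generated from
    # flink-fn-execution.proto.
    print(sorted(flink_fn_execution_pb2.DESCRIPTOR.message_types_by_name))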

Running Test Cases

Currently, we use conda and tox to verify the compatibility of the Flink Python API with multiple versions of Python, and we integrate useful plugins such as flake8 with tox. To run the test cases, enter the directory where this README.md file is located and execute:

./dev/lint-python.sh
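
For a quick local check outside the full tox run, an individual test can also be written and executed directly with the standard unittest runner. The following sketch is purely illustrative and does not correspond to an actual file in the repository:

    # Illustrative standalone test; the real PyFlink tests live under
    # pyflink/*/tests and use shared testing base classes.
    import unittest

    from pyflink.table import BatchTableEnvironment, EnvironmentSettings


    class TableApiSmokeTest(unittest.TestCase):

        def test_from_elements_schema(self):
            settings = EnvironmentSettings.new_instance() \
                .in_batch_mode().use_blink_planner().build()
            t_env = BatchTableEnvironment.create(environment_settings=settings)
            table = t_env.from_elements([(1, 'Hello'), (2, 'World')], ['id', 'word'])
            self.assertEqual(['id', 'word'], table.get_schema().get_field_names())


    if __name__ == '__main__':
        unittest.main()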