A Dataproc job for running `Apache PySpark <https://spark.apache.org/docs/0.9.0/python-programming-guide.html>`__ applications on YARN.
.. attribute:: main_python_file_uri

   Required. The HCFS URI of the main Python file to use as the
   driver. Must be a .py file.
.. attribute:: python_file_uris

   Optional. HCFS file URIs of Python files to pass to the
   PySpark framework. Supported file types: .py, .egg, and .zip.
.. attribute:: file_uris

   Optional. HCFS URIs of files to be copied to the working
   directory of Python drivers and distributed tasks. Useful for
   naively parallel tasks.
.. attribute:: properties

   Optional. A mapping of property names to values, used to
   configure PySpark. Properties that conflict with values set by
   the Dataproc API may be overwritten. Can include properties
   set in ``/etc/spark/conf/spark-defaults.conf`` and classes in user
   code.
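As an illustration, the sketch below builds a ``PySparkJob`` with these fields and submits it through ``JobControllerClient``. The project, region, cluster name, and ``gs://`` URIs are placeholder assumptions, not values from this page.

.. code-block:: python

   from google.cloud import dataproc_v1

   # Placeholder project, region, cluster, and gs:// URIs; substitute your own.
   pyspark_job = dataproc_v1.PySparkJob(
       main_python_file_uri="gs://example-bucket/jobs/word_count.py",
       python_file_uris=["gs://example-bucket/jobs/helpers.zip"],
       file_uris=["gs://example-bucket/data/stopwords.txt"],
       properties={"spark.executor.memory": "4g"},
   )

   job = dataproc_v1.Job(
       placement=dataproc_v1.JobPlacement(cluster_name="example-cluster"),
       pyspark_job=pyspark_job,
   )

   # Regional endpoint must match the region of the target cluster.
   client = dataproc_v1.JobControllerClient(
       client_options={"api_endpoint": "us-central1-dataproc.googleapis.com:443"}
   )
   operation = client.submit_job_as_operation(
       request={"project_id": "example-project", "region": "us-central1", "job": job}
   )
   result = operation.result()  # blocks until the job finishes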
Classes
-------

PropertiesEntry
   API documentation for the ``dataproc_v1.types.PySparkJob.PropertiesEntry`` class.
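In Python code the ``properties`` map is supplied as a plain dict of strings; ``PropertiesEntry`` is the generated protobuf map-entry type behind that map and is normally not constructed directly. A minimal sketch, where the ``gs://`` URI and property values are illustrative assumptions:

.. code-block:: python

   from google.cloud import dataproc_v1

   # The properties map is set as a dict of string keys to string values.
   job = dataproc_v1.PySparkJob(
       main_python_file_uri="gs://example-bucket/jobs/etl.py",
       properties={
           "spark.executor.memory": "4g",
           "spark.sql.shuffle.partitions": "64",
       },
   )
   print(dict(job.properties))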